[Binary tar archive — contents not human-readable. Recoverable member listing from the ustar headers:

var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz

The payload is a gzip-compressed kubelet log (kubelet.log.gz, owner core:core); the compressed binary data has been omitted. Extract with `tar -xf` and `gunzip` to read the log.]
i) T OL3d.ʜpTOS-%NjZcY6*eĥVj(TBXC#w&SR$@BƀO8km%?Yc;9t=lk4hX|(Ho4m(?B@=$3J J=n<m͸2"UHW)`KD5xSG6:zIC[zdK6)h8K",QQ;HF3mQp1҈eJx0톷̛spکqr|ʝĢ˔-dKJO/iOW*#::FۃͪH(?W?)j7\BDWO-}%"sze"ޖtuyry%aSYn;ݪ/\e tZ녪Ra^R{w]I< i«gLRpT_%3Z^:ph,==T(QL&JQĚX CK)>RfZ$|BD 䌖P 1LIl""c=[j2q}^r,nEmC0ܫPD85J× ӟ-ȸ:seG%bH@s( N\GFDᴶiY'wdVw8Me-95 Q<8bgh>|BK7ѣ}_g ΋7uM K雊i4_}?b ;yy2{ּ~hr2ʵ)vVdD_ Gpk|֢(~>e0?T,34tRhs},3.jqW}&FW*h u2Q$q2Tm@R ht1kYaHJ jTrWdJǵtRnqҲfEJe/_QR) 7T=\ b⴨zXw5EF;7&<'M]M=dS~L'Mf8+t|2[q0S2?aBvP./V tѩ7BE?53,W7NIhG0~-yq2acZ&%QF~L+MQl!(_X\+V,AJ^6 Y+W4u'(h.B~[(+zIKgUcozdE}ؤyVF X>b.r+b`}NtoA9%yOwWT%O!7cmNBk}94yY*ߠ߿zTjKȳ}Q,rw{3):5ܧ+rڔ-՗!WoڕQ$d=)#FU<0m|jM;Y|pŬ_ bxz9DXdy Cp<3uo#F3=F҇[ybWs4gFG2&2p&Rr])Q!y٦\bDs SkҤ t[s٘4ju8{:CS-BcmpyZ=e06xԈ:vq7Ǐn!GL7%޳oO6faN6mdp8#s3Վ`xw%1s^E2zEhHExl3Q8p8;Q8iYb=a&rgsD4DKs#X{ tO5ަg]{2&=!5t:E{H19 ? <RzJ9f i֣O⸎Y'ِfCZBv$ ݟ+ Wxbb'& S DHkLj%F#"TY%J`QF*ɷֿV6B̋|,v03cS _DCB' 70e_ΚvK(&6F \<D,8eTNєhK .VwV C)muKe6! *ZŒVsF 'Y0&,(|t2DHh*ʃi!$Ƨh/{=Yshf5jL&q<4ʂE>wV*H`49"ZdZ}4 :.?VVbo修̷:琘uU^ędAj RO1(.#qNs"OY /?#A] cw9o 0B?-iR!)?r~3$EIH#GAlG=~U]]{jRib5B5Wܹ22͔S.)mޛ9K B[Fڧad)cIM H mޠR %@L`6Z\6t|P{Sc~S@Zm;'qrϣ% H2 7 <vYf4D񬽕` #Sg;:+C_7?oկ{GM{4i:ӕkalO%B(;ko 6MWQTEemN.z wW%Vx "s4ŻbS7j7ABp__H-WM De%njL UIKNxpYtsBȲ#둳,`xuxLHb^H(E1 1l8!FO8x/;N}Nid%t.HZ}a9hșH&3@.&;&)EtB%aDQ! 
$>!P91$0h0&3Z>\Я@3ymI r>\-t G,`b f|։[צˏ%G7_k4cHb;#2хLO[ȌT"Ldu*D0۞ǎJ8|qOX;?נt3w>; .J.m}D/~ik[d^QHK ~3+Ef5n`Z7]٬B:²x>8ړ8*Y'^Egj kYf Vҿvϋv>9(զ[\lL-7#3 2VJ ) J@ڜGS.BJtu++nF)76u|syy;nNŮ:aKuŦjvKs-{Y'%?ymZ 6g/8K`A\ f ZԺi]os4͖ hpj6R˚ZsjݽۖFϗw^g;vz(6xzw(vZy[:N߷=f}K4_45Ra}ԦK [m"4Bo K$YzøF8+w<}<#|>]C>_H8=d :k%9#cpLv']=Xm Geb@٤ $Iόd"F։hCS]rTgJgYr"uR!@9$f"Ocu gG}|'ON4|7Lfu+tj۬/lC+d=4'ͼ2C6UI&_,wDʣ!j1J ':sZ:t<ד@r@2DAZ!tq S|*v^=o_>{[Ym֔lWYRpAR7U.{y|uii܋U1MtV~[MƳ)ADZ.Oz&t ZI6g=vr~ظ\χoi-L`:0M BkS=A8ϸ+[3 Fqk%%Vqa/fXnN%d4("I|8(l?-!5cZu\zV(s<5y"sտ.j\rTR#$dcL:̬* *` , 0Ӯ:vu9!.@'k^EQDQ+(NZcAg΢vDI;r H;=OQw7wbzTW:v ؜R%ѓ|%%PڦH䬼ds|&OEܵ#w] 2]Cj5<#6p88Lg hAT}!)\FQ JUFe*gr,,t!Ⱜ1AOؤL"%Zb,A3ҵXJ1R։Ehn8zy Θ*j%r0RH!B[c.,5(Quc\=s}rڣ{`oo= #pۙ{d@A&y;b66Q%Mͫq5Ɵ ~|Vvj|5xS\h:sGhN 5TJmzg61þ% rMy.Z2J|iugDd1` ⴼ'M[.Y5eDBͷ~ĠFTˣyXv+^5G/}Xd\cWcjwhJzAm.VoEmg"P'$&D%s~/UUUd:: ]]v.vf.Yc(j%YժktBy}&;&&_˝(2Vޙk9E=@8l7iRYQ;嬁S u2dNZ׿[;>{t7{rx1dG+M p!mR֐M:Cb6h\qs0PnKcN+P9m*(o]_Ul8;! =_F`{Xu>L/\QQwK<0܅hPkr!)E ]KBm sng[{zsK601dYMlF蘌 cge= @вjgٗZ8F ApuDV @ki0&ZzgD 7 g gG>k'f Fh9jt.2!U &P%X-RP6 0p[R"|Gݳ@,AN?0'넉q>p8(L{EVq8Oڀ pP3gN/X[^R+Ȥ(2n8'IΡPHAB2:l).ΏFWW4.}îBx'.:e:X&TpNsV@TQA(Hs[w%yX3ٗţ̷{d ˉm-JJxdȕq9)X29TD٣;Xlc0_bZkԤn,3y QBU }<ģ qVduSvsRvUG~]VIZZQj!OL1zBGB2y п"!d$Lr:pPΗz6A+K,Og"Uԥ>3U3jW4G H~s|ַ{hqcg]tvMMɼIk5n - jvTmPNTwp0BHX]u:ǀ<eR5!LwepI%$Ιke7Z:VqF{hjN8w}j|Vmfy^_y~,Hol(o Czqqϗ;q˵JWi)򦺄q1HԈJ察HٝIc{ȧϦC2ց8hVxeJO&$m8E&CLY0d0BW"9jHId$'̈́#&j FdKzZьG_l^7 I^$͔ \C_tRjR޵qc"䟶wjlo۽-P, U+#Np4%Y/#KN"i8yy:ⶎy }xw&lNu&Gc͇/\7][0PrM-p+Th^(pΜJ(bxC4P l΂KrSEm2sB)bjV#q.pb IJ6 HJ 5 f*UMxb-G<(9 Rk\bKT]egsiŶԼBTGc;q$EKWy^mx_kUc [HRt>J8m++Wr\2J9f =~jƎBU=q XN>4#{@ƤQZfµVMl*۾U}lV6Tk)')5mCOb9~<:)-2;AA`̪ͬېЎ0-%$ RS#x=H<2$gJ9#?PT6DQkh B5 jmJ$%CʱڞrTrsP]F%'$M#vTy\Ta A1x+\z1)O(ZQe7[Mwe/7t4Gz=d6\ rEVu.-l52ӛWk]9fLn ը{<7֠RXv_3Wg:Eۯw8+F&̀ c\|3^6ӗ=~bn/֤w/z]~t]9=qQF!f_6tI۟!>ƜPBtkܴKVq[g3hP̽"kLXNo0.#uE~[9J1'5kzur qrMY'L.5{wA[jx?W{wVU>єh˕5M?L=o嬵Ε HzZZj$7sPX &`^*GDDK jXU>qUT+n#>1 Z Qpu$ygiNs#B9Zjh騪F;VO-WO;VOyѷ,o) Wue0VVe*{UrqUy5u|[EDSYA Drx8( DHʆq˜h$RZ@E W`]PN[-B/ 
bD1/)vzb,b`er6Cޕ$;q wHDn=7\ ,^g<*}hpKi1ۋ=|s ,ģQDQ|dzu|`$ziC$H4݇KH '5*AjsI1ÌN%bpI_,9W3_k4՗cQr"Qjc =x) ]'b6 H1pIx0J:]xxl!ך;8ϛ,ҝK Vܡr*3o^]Qga+z}yт u`UŌpWyVN.u;RC>WQBE %TT"2@S7uNH$@k%C`4' " !RUW2p$)}ˬ$B#@=ͅ 1qi Om5km9Ԣ:g;bDGߐpES*Ҽ\ s裉enzxd݅o~pm'3ό\O`|[z7ti;7L\|72[z?SOIs۞nm=>/'[l0HlZn6?m.}FvݟN,|>pU O2!uD+AVZ݁h-oҾlwYlj-{h)>8b,@RTu)F #ˀlyH3U۝|VB&\ rps+'XtGjQ@õږm֖>`#E}r56xng9AvhCQD/?__)$qB蓢Q$cdM-Ңp( HI*tULv₥2yURuo"Z 8|>`GfLf 2UN\hk99x) 8iUlڋb9cSr]Q,Y(ƕxؠqxջ9GSV`P:b1V) PIGg|bPM>4E!fgqg3D'=ܒuG}>hZ|yd~'7yX+^dޓõ+n+voěvr}~|9p)eڻL6-,U^@dsY a:iLQ ӊݎ)QmSL,D $iW(= nrP' CHq 8# AUWaP9JY,9KBPNQFXSF9#Y}c99kQ$flW,T~5)E}NEs"7J0*(rңpd )iMN(] ^( BA2NBTx.`hQZ@Is2ZxImtpkT4Y9uJj!ӪxSB ?LHP@q%;tmcűUă'L,#<"ݢ|ARa{@2A5Iܤm8rǚo-7𙁅̅oDS%'j<AcsQu%HV;',&ÉmMVm,Kt~Vxu "4?U$jI@B 6y` d*7g?os|ͅT q 0%UZΡ.)oSMK -3J7R>.VcRv#!C{M$e9Ა^2_ddSX&gĔn ZA=Reg(t~6~G}^<_TsLF\ZOS!ѳy{==oWMln~|Xuvpe=\3/䬘i|W!fu !֒QZATNhR J÷;2D0{7`*S2{]P:uԋ*`nLZOS\X] ڥǀ|Q ZK2Nf\YKNNHks6Y+W@\ _Vۤ3߹;ڛ ٿjͬϻ3Ϸ͇ma{ B 18.7Cې`2s^PZvb*J8dh`ީU6yro*7G~3ybp]%I57 IzI"&yx0)DAփ(e!~+ 'Zz|5KsWR"ؾoDZ{DF\}x{̠2DihtH4kQ|&5yҰeH}=:+Ps>֮ɛ˥}ZvGH]w/?fttp<+Xx;<rĽ+3eQsN}~kmҋ~{Sh߯>OOzw? 
Z>NK6?ٙUv~ngwGbۢ.LxtFI7;?jw ID'{KYɓ 9 e`D(P6Bt+R[kEHhFrBAb$ʾ %FCbi/hx[+&NE \2"0R|?E' ' D; >L+@RP"0 Ť+[X䭴!XݩT{ch)HNQDD1k+RbTRQE*5h11 aKX 4MrZk/.CZ4Bum֍кZ7BFhu#n֍кZ?`M#nFhu#n֍кZ7BFh9Y#'k֍Z7BFh@кZ7k4кIZFhu#nk[>dWAG:5]v9/\KT6PhS#}wzz7>^Z](x|d{c*&H@QICUAJ,* ”PV!eY*%E<M)+MU `hfbobI<_1-gjތ;[UЂJ:a7^yY,QyEb&H^hYZޤ\9$tqaZ)ԓ|Gڦp#+d]P" )IBLtR"KD,jS%Z ܊TUVK}zu|t0o>8:?Lj4D_2*d " A4oNb'c Xі,IlLZl5A:\F'kbs*Kȋ"&H mۢ|e4^@c(f9QBc}۔1뇶]|ZxyoP{=eGe?=÷hzW?}=h{÷%XSfsſO~]ݨgB9<ɫJm==;QϨJЗ/qeO崷?)GiӤ~=u]RnoZJUfox:(' g?;?pi42xulg~9j^t!8sΆnew 8{u_ F&s8+;%N R&^\cnҰ믗ɖL{ǹ_y2K#/|;dyxEkفnV|ˎ=t]|mFni7(fpi!x^bk͢uŴ6\TsY?I2U7.3%Fッ㻛Ӿ|{ޭS7`+[V~Fkښ5\]*Ƹ ^N\T]Q?LFfngݛB(Oe2O'6ڧEsTZ&gzx{+wipɻMuZh0&QX%EQ\ Ƣfm5+k D}I|E۪ݽd6=n.l o0"HRE#/yd15Y+zKjGK-Mm7ؚ--mrYuU]ZՃDQ$(t7+|TrPѝRu@ ;bU#ޅM޸yx0ޏܗR@CEDS9 ,j̼CqD*Uj.@HF"ЙEq{" d<lt>YNc%cYy8 F%2>'%8Tsָj-[ w,RPQgvA6٧Ζ0dPF?#lb/Bmx|"NARXt$2 "('C<Ő*&F݇ :2,A6'T 6F/G"$HS9bDx\mk4՗.ch2QRYz&KV>P I"H%^1^hLhBMƶHƎr>oiӖ/:ZK z3C*j;;dRPRN7-}%N[{\Ilx#+ #3֫E̐sF7!/ 1 H;kR1#0Q(tD Z]5z8u^ =]ݔW:VSwS﷏.eQr{A `.7 <Q]K&]L :YuSڗPj0l %l%l%$)jk"LY huΡ)HUJDR Z8-ͰтwF͹YS<oΊfVxrw+6=r]}BWW]k}Gg Kx!:tZzJe1;B.qɓW{^Ϲרy2WՎ>JIF<_S z}~4ORtݳ˫22Y@#}έho΁|oҜاgJpzbBF].gS@C8*r23nĕubNe ehn1{lZ/ώn*F4fƬ.Y$ H D﫾-,Z(6k} 5@ mN 6G c2^0%ݫ!Vf@!lOl]{Yw(G>tL 'I ŗIpdL>>$ՠ30U!0Dv1RUXA Y$X-5K^hqJ:ԑ X|g4x\ -Qux3\EWP k "8Y:ck, Ql-uĚ{:cBVOYi+s1T:<P{ (sNj2cLl: oR"~茏9:!cw~j<K:ā '}D12\z(` XIQ3NQx?>NzǽHy cȣ*F!-JK!qAchFeh8F%i&>0fG͈Z \ʢ*(]*-]I k0u $;Ay'nFV./f&V۔9F>ATw5z܂!dCO;K#6 x2 Qpki;<㑌rAG3 y+:LctqibԄ-<}% 'lrdr \ml=I\-?[iY~~7- S0:XY2P<1XtNzQ})bA$zӋR+OQté)-d{CH9f.(*d"/H>Jbq&¦,}p釨#}<& 4-zW5RӅ~|ɳ{jW#r)/)ѿ~ww#Ы_} p TH@OglM̿%f/d1mu=Z;i5J;Nr{6JEcq;:MLfP"hd7髹it, ( "Gt@XZ2ܷ~N5hX]5|6`uU$V:&1TEVL.Eh(*9@.VΪgg Z'`\jfOsr3MzݬٓcazǶ]9nNN֤,翮&zf~o-d㪘e(D dPJ0b*N"t2w&=;r==ivD{(T,8blgSGRy`&+ `\^eyI0/1 " ]Y$t]ytMA*<t ;ԒdTi:ܬ;W~yLZ*o\D},emORXA={* 0Ů(BQ@_}J`Vdvtz!j-V$TLkPnZHI80@ǐ[\0"*%)S3QUݥ.|)l4'e"@cDa Uٗ!`-証撘*Jv&jVfGk Ş=<Ȭ˧SPL-hN1of[Ztuwdg &/E'G,x 5b: EO)\U)Tz{ںqe3擇5^gҺFkU񈊟'68XR+3Y׵qYrsWN[k'ZξʾOnCzSO|+os.ǭ1 bFk"X/,UU 
I%7%$cw5͆?pwKLFcҚofoՍOIoHFF`3RS{R&4Gz]Xw*E} ^e!25 y hp:[%dϽ?E`n+MIgݖy2Н͘#,XcA.ᲈ!@!Jڪ JJ^!Φ{ƴQ(cgB>d nQ)Ⱦ7+Ih{4m)хq-Ǝ?e( +s9^gFJfԩN.KՆȹ*$f]0ksp1ER[M&'oGVRpz3m,d i9B~ۮVRHCY8S@G|Fec!CֻjU Mfod@a`>w-^ }2yͅ7Z3hF>&߳ el1 X,)OU' D7SCzև/z1^d"#fkt:`ݶmgZbZh?\K_o7 Rָs*k){Z\0L)F.ĈE#%Lͦ(fE1ш)>zxbR(6k} 5@ F rLr% 9Jie Jl}h3i֬;'͖3c&'\S|<(o\;:bfd3o}j>2 /:|e|D5(>n0U!0DvN!E"ylV TGn381sb('něQD* Q"]Vxe9>gi,$JΘe8眭3&d=8k֝#qOn=ZK@@h@] F}449'G#l2Q,BJO_Ei|VIQLeD- $DRƁLSfT+O8;Ͳ^ipB%Fr aͮ%P(- Vʠdh2q4q$&޸IV5$t |5# \ʢ*(]*-]I k0u $;Ay'mUBh;yF}YMl [oSx( ]ZnA'ᝥ[M7XF! 1s-mG;x<1o~,0aIX^ޞeG9%Eu@aI$j&!'\&H-@Z|:owl,(Z ,:'(92|WgvB6sxYO})~݁P)Յ(Ro4oR%e~̘DnWsӀSi/XPD,D duNvM=7a-}tL^[c7ĩ<&H0]PTr\U%ufskXk8&C:M7k\Wd?z=9>~l,ߩeJKK~f^]5 sɄd ,C!2M RySuMv{Ӵ4;i޴y/+e<ۜ>ϫ0Yل`x[w׮p+%X$4@tAdue4Etx)DoPKQpSyǤEowٲ:ZpERIS8?FݧfEfIL#4[HjͩCbPU)QVVer FD7Euz&:r7۞QU= Fsb_ց(,+$c@҂..$9dQZăީ,\&r}آjKWew_'Di7i/Mߧا->FlQl`^HEn 9BZ?ͺ@([_}*t\.#dv`v`^;#\]_X ?Cli`0cWSCN*͓p԰>WtQm*׍^8[v{ K|?*v[o>KҍWmwz3i<`d_܅u^ӣu8j}l8pΧmwO{9fKq6Znuؼa}2ߗ'68XR+3q)jx_{. .OjiKuD v[vyUUO>iXԢ8&N̍Ƽ+ M6`9do׻?{Ƒl~ 0rNv 0B?eiR&)>DI|I=2l3~VUV6@,7b鸍xv$jAH.:[dNe>(2ܪS@@h-Bk&20N eNA 1z;lsC eFF5JGvlR D-uJcaPncn1ͱCG{BGZ(w-Oÿ>fkm&p^A _k^R\JNuhȨm 8iphS^2Zz#+HIdw) e=T8匼3nb"8X3\8_d>-*1HK{Zy#,"3RLYx\ E&b1XX1o+;Eo_B3M4.yëZlVpkZ|7o=dn#qsr"a ˬRӵ)EL tZ:!Ϳ8.[dd$#+ȮvEv[l;'Zsȳ2KN! TMP[ Ua.&.|H,fwj}`3sZ~4@CB~&e|`4,W6ڨE1ĐL0̚DІt޵jPd:DPpȅ`I6HP8UJ 8c Z6(֝-cpR(,ִ6K"׳k^}7Ѥ4Y^}y#e~sAmk)wL%xQxa"! 
Ȝ 2x 5@AhfqG n0$ rci%,:PYù!@OZfAtTɓ#/2Yug¨J$eÛ2@;*MAB~mӮaL>B=p\ɤ5\|9K3cv>׉" |pEAiBؔbkd$ |V˯ PƢ>*f6žߗ'oU"VK}v#1iwWlyU?h0ow.H%نÓ=jpIɟ_N\˚'1`~ҊMrۘ'moTJE7ݶbl7VV2zS5jz߼j>"bd*CQ HgmS>[zZz6sTitk/?'_O>}0|Ko/n~'mo5sU=/F?o,k8) kfBoO/n1!$ث~8{r,d\g=lHګ5|;]qc_G|<݄9>T +D-;6xYڟP@qMXwyhwfwIVUN' iu[̽~uwO[&zy"R ΃":tXW"EoD[ u&-5kfuurpHy[D!,M_.A z2okO-mGs}s%Wc"~j_e~u-gv}۹2׭Q&&V9hEQDQ$:3!`ɃЩjmiu>((t(c\Ȗyͬ}]-wu"yxVH57"SwAEOi&=E=E3zC>Ơ7\yQNjX؍Օ6^O+|yK~U"XrURsQӸxm `E0u!v~U_ nzg򩧻|SUxS9oA`'cZ&+YYt['˜[*,;9 ::7NGЋYN$4YJ$<&[ug3]I; z/=rHFfT.H+~e9 (3t413+睂؞mvHS8$aD)" $>mBP1$0(0"5Z>|4YtD敡Mc0L k L>č7ˎ7_7cHb;-2хLo,763ڤQg$BVt|wBs??}vng9Cf:F1)ن8~J1d*D+h.{^6diu*Z4=i\k ^T/o\8`<˅6cm";~#b-Ɠ-0NjkhXRs (qco`ǧ>M9kL罋ӳ:}-($bm%IN<o3|֏`W׆^qϱ(h |4;(j7own NhV<OzNmg~;_1;5)C9'N~OGuxH/O=Ƥ(Qdu)B1 W6e M.2Nv 0s:pML]D-xТ<"#V` e@k̢x:l\s˖;raZ{R-:m[[ĮƱ̣g,+<# % Ym-r(6D6ZeRvA$ `\KϹIH-[^֢(㾲P, u' w* /e.M^n6ɟ}7;?[F,OEIm!RL%uYCp:ǒ6tƖ'urKDmpXV,)  @*ShiQJJ,68㱎Ǿ s3 F ɀȅB jH1; [q qZiӞ4)Z_$Mu^О܀a.g^ 0<_UIg+LBC-qetO4gZUN2`r![5N& )$I99zP;Kǃ ) B9d,[ %l)BV'7 6Vw6[n<~$m^f;- J48&2|xWӧyOW7n1n?Ƹ𝡣k|pY$Z[mRZs& Nu^:pgpd[ xO8POU#?{q /m0 04i m)41Ƃ%J%΋C"%Q".N u!\I^S %K#.l.E;Pi'ePS (iM DFsJ֨)Ffpxzh򾧀"h |Lå+e#g, J.)xᒶ|faUӳ"s;qh霆G6"zVଠ/g }oɿi7={gfF{;-XHOmS`8GIPaq} -N#[T踺?]VģfJ+2f0E(ܝl4Z4&ӂrc}K?[MM1l qlwV5\Bbl_өAsxxz_I ZJdhWuc iѓ`\IX8iU>y1~ɋxu!_PWOۗ74~le9T!!sQ:]zXMY/sIJQl J>d>)ey0Q˦sj sBr >Bq߹~1e aZ1}%o^j+nhn//{euwd)VJRDaeVP2J:s2ԕhLTI\7 ^0·VBHWYȬʖ AԬKd񤣮>hwDV&N[.OcP`.M\_OIXc m؟svQHgrJY#jQWdJe 5i(NwҘ׼ac ?m1ތ5qLfG-?^z_? 
D!h9ɏ˱PP^,r{^-׋Ρ޶I6;zQhgO~=/q.6r?ӣysRoF?"]H*Gx{"pwiw!>;β[yc|o'4Onu| rB˳19û]kFYSd?yg[Voy3 W?->ft0髫lO~ j2:kp.le{XwFJß|<bg+}Ocm̂\Wltuχ8ǽNӆ/okr7Z6 ƊmPng%0?&fw ֚mRw|\Wsۡew0)Wb;3%PF')wwQSc|X*E~_F&&"Vfz?p^JA n |?'&.e2{|i\Y}"^rnV#8jq֮JS\L:϶ IXKjgWc^Ƹߞb7@1YYXdzz]Ʒ`蔷\4ҔOJxbjf'jf'l0=jC+nnkpsۣdiIvd<+lM,I!cݕzgndr{%9IĒ>ggp_n?v!SIm7d+=UG9;lx,+ [\*N_WXsGǧv#xQ+y0}]y><~;Qi37Xtl 6=_F-d!1rDYB$k{8(f-:zyApG?D6b);t,Zw Vt ?dA Qy9˘E %jY8#.(P.$ja7[]?[M"i`q0rn8Vq*V`5BR^ :fPlK,6qͫ@`.w:{J?.Už_-HP62PJC9)8P8j1kP7f8A5N2yZ_qI$.Kj5H.B.E%?$[SFe$CRR?$sHw<9_TD\q&U&g{B2vʹD߲vzEX {NxHooGᮮ)?8>~Xha!JH9 yW.1`Xi C=0@5 ~?De |v6Lc[ LZ\Y!g)P⋽Yŧ#.taP\h+H;{Hg,3Ŵ|z ƽiیh-ی8„7ج'Y6aXG)&9' Ds]*̙*m|yٖ |Ը3~=; Af.߅/^jCrfc䥤8y!.szObpp<,[/gUKmhX%$Vz-:X(ؒo_ /S]yqZfnP+8'̹gF3+.e+ (l˖OhE F-ybV[LK>MƁB؎k^yj~A 9H>ߛK`YG6@6}{7!RsTJ_W%*2MA[%SR5H6ST:ݵn'r_W$5sB,H|3 \6_[9HY@ՖW-k!%d(pTLfۖ% SL<+QBr)3EI#)UB:Ql۾kGoHA'zVOBzm1Gfp'O`T*Hk&a*Z($뒴Զ~we||tt>iM?zx'%Ie]5egj$Q Hʰ0B `-voKH(@UJKf`j $ :M:,؇Sºl 8:hyYyI\zLŁUHRJn6Lm:T&YB+ %TT0!o YSf,4ʨXjP]bdsmG \Ev aì D[*xNc! |$Qi͖uH-R4֧C\JqYֆ^IU&'8Ume9rm-šÔH- ^TkjفM7R@bvc;"+=qu~ 1RBUĂ,bi$v ԵBZ%AbV⢧9RTaYjʕI@rcTИ%gEKt :$J (Foua]J aX^&qj͹Xo rHbK-U3ŮEE#>p `-ȩPR7RܠlDuԉ[@alMlG/RX<)U8ֆlUͩXbY qaj5Mۘk]`>SIk g v<˔{ `Y kXJMIxo-[rW*w\tt\l "tQG,J1x7$m30.)WSPNV"+!-teZn>Z'5!ZbIyHEC dx&*P ~ P d 7a鵷C1!bm-Y;a ;b:;1Yv9fI1ΌX% ~b Ak*,S;f_m וֹpS1D u1QW/3Rl nIe:&bU]F)p5=((PX}\,05)H^bhou5 $W'S@">;%$6J'@HN =;os;3;ombuq,e-$u&l`duP7v1 |)-bVuqR|cPQYv"7hq"# Dhߛ3҂wnGcUZKpa&d*rri^e ̍)py X|A j\R>HN&lZѰ2C>8} ,]'V;Ѱz{΋EK2zJy[f@ `mM«s"Y)՗g/9W!wFzۆ7Yr8k07+X7'sheH9A8 a/B #tIUSR"8e@7 &bGQ v z x l+ hX@fPhZ#3ӎWNVU0-σ ^b9=\DZP8$ -Ʈ\?63H}<UD1 |&XN+Yn#IBs׾L0u>*,2D)$ dM@32|<,UX3 EX'MW`LQI4@fq LYU[f^EXYG+M!afs./koNrK*Yku0q\X ArBŤCd״H\l$hl%X0Ե%骅 5&Sm;}(0w`)jpue I<5HU V; 㺁("7ofnCx xS!:3xsȇ Zٶ5$V6UJr) .x*z:)0*xJ$Kk0z wBA\mVCa2*d>^W'bʅs " +f4SNozl]2!v*D|C!E r BJҐGa2⓪րbqg :U[ c CE g9Ĝ" 6H@s#SԐc5zY"(ΤX`ҧd:VMI ̵5o{jyG ś)KVc!+<., 33A: fY }HVX4BLYUF|4LJ,6MsOt` Pb.Bjmr%A0i0?.itφb`bUq("(eb3+!" 
#`Cd7]bN."[ AAl]튄X[ B5jhR/,kq|~ )('#b;!s5<4<$~}I1o2|>q:cIpYqzfy:ZMYV俆x@- X\F`v[ߝj}!^D_+o<} 35X+"pE+\W"pE+\W"pE+\W"pE+\W"pE+\W"pE+\W:* s\Ɲ ulW L#6S=MG}5h)!O"γ>(ó.Ydlj y,trp:?['ozԬdiTeΰ.dqEj Zmre5F-j8m&ƥm~MO{/dn~ti~cwCy֧oogw(l1'<쎩/~NX%-Vܢtum8bT1'oB ;?O9[aIܾ0ݰ[v4ݓ3ܾn-󭮎ϱңa!޽韨0s{-#w8~vxkkn5t䏉iaLt̯lǜo?f[?O'n'Ȅ>ǣu3a',GG:ϻ=^1hG:`̍OQiGDBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(DBd(G%Chω sƜ  HɓOD2 0sPBeGؗ~? Rbյ@F%Vy KYYRD,رRIR̜w%,E.K!d qt Ygg ӮMMRZd R'YM }x)^!ѸB|v0l?M+7!.^-Ǭ'J*nيJෝfAPbŪۊu2r{Q.iVo8hJMN4U 585xcRe㵳/Fձ9XrwD6TDATDATDATDATDATDATDATDATDATDATDATDATDATDATDATDATDATDATDATDATDATDATDATDATǤ8wFT l~ [>9|a܋{aǟ3|zN`q`P+)+Xc[=7 6f.*D S<}2{oڟEZ٭3f]roЯl~^ _e5736\a/jC,#'aınC|z>3%CXƻoÞsõ3a"5_6\}Lo;w|ֳʱhjٺ<uPպ6&,6yчel&n˸=Jy[lgj Mc[h>Hv{7HrHˋXY<>"N㋿!ϷdYd-6/՛\*X.<Yz3&0slEY[aejjU2``2F!MilB^$9_EyS6"Yb7gŎY^K.Oj7Zmj[0UURtA AWX,3 KF@> t`>!\!8HH_c#oHHHӎigaGPs-D{3>?vOKzqP s?իRY9󨵠0{jӼX 1JnYZz$qIɢ;K\u[֪zcVH}: q`YKZ=3y؞T;>dL2<`jH!rȅD8K&bBnOfΫ-x\z6g`C_y0YIZ ʹ RR5!yU⑔ M8)6IZ? &=D#+QEbu!3c7{< yXU6((E-5_c6ӳa 8{YW57 .f\+kI7jI"sU: ;M*ROiWPax#$=(>]2q1o2DX,dl>qe|*d3 ’zS`Pzvȡ"G<[cMa:^nC~Hy5jɖcsD;*Q@PV Tl6&FHc\bCɓyKZdgM!7r)-zbw,zpi7=Bϟ_G4ޠryf0=70!ѧE|{glm6c ct\Y+W~Ymz v-HQ^B7 $ꜸQQ`{>ܨ(s *fN䞸Q? 7g2@|ZV@y&Q|nZtک|u"G)HG7&MoPڻoD )R"*ZVUb/ƍ;+ki{ "f q3Aiw}_ Â1z$ò'{xxL&3+/zV}n-] ^me6vEU洉$_#sG"'N|OJkOJkAi_[ʅǐͲbKF"mʰl5)vY. 
ٴ-fia1K3qv vJe#z:_gekaye)r_Nbԟ5 Whi f׃W(^R)FoRP>S&L˛VSRJIoR>ԶyzgeN{lR?dւ'#-/gVv.TI2k}>-N$'YȬsqhɢuWajRgnU Fla\" _f#3V}MA3I1I1N1e֤#zuƊEJ`t⧽1l}W tP/rI- m := L J[rdۤC-xD|0˖yj`EGiY«#0W[$ݳðbǜONWfr?f9 g>Gq=9JQJC4|4Dq92j5.9Ytr tJښjtL&ͳ]$+**/2#ԆGE:\r:4!P=GmlꋄE Z+ Eb{ٰG6p:Y{U9&WO)fT WU>OYKuH@7Wּp lrQR(j'%ubeUFU $)mLYP5Xosc:.EhWaC{KT$PK\,V=Yoӹ7ٮ՗@V)Lߢ.G4cTɱILI<'jm{=C΂ɅA1}Cz/ =?{6.i1(dBԍM ˢZTp@le A"kZAt~4Ict9VVb"q>ŕDEoLD=dh8z15M&k`{5t|7׊H!WQS1sH6Z T1[9ז@ As4+"2gkmCQDʇ=+ߒQd]3SڎZ'KI㚡=9ˬQ=S 0֥Cю$zA?NcMh/nHY!lEU7ẁD˒[&$q;ʍn3:wti}Iv f Aw~vzT{HLZ~)rKD&`a_e\q>=9Wr{П%:)XqQ:jc"g(ۜ Ur1j HDzkrB[05Ѷ MμFW ow# bۂT Ft +jIq1@,41ЈLQ^.^ ŁG,;G !Z5Bq:e&okqqF ۛ}3ntr?z6L [?.WZczUϣmH˿HG&=*dJ^1QxM{;^ӽ5L=@U^؀XyQt:Ykv†*Bš t`8D[%Ԕ rjm@eA{Qk1$ˡdUvddQ3 &F/mkߍG{PU,twB >_gCV LD8O_F ]BE/U1B>!&b7ʐDqTbԥXc_ٓ s4W6.jsorrr@Vɜv'arkABCDs`)JbfV ;@h%E!T}p{_#wuD~@vshڡIiz5>CT[6ͨ7#)֦}őkˌvOgo${az9ߣfQ1G)jd=j6({bb$.nD}y>l\Š-|rCrזGKNoSS%#WS:<=/ 1Gy-iLE oy }ˌd γ˰:f{wӳ*g.nyJ\6;N&cwqƖ2 Mgh(|BKw~Ҋn+E_'g>h\@Ptq؞p+aSbPeNP!|UJth1TEhmZWcv 5ó/1!UDћl<8LamՠE -fj[=o/CfI0%=h^6KzlM Y?Y͒u[k+ȝ]CZaeߝA`}P;(bE[mR])xTVȟwwӝ_|]#H䝳r NkDxFo X[32ԚqbRo`ӹG54kI--^k9?}۵6`٤>VUuA`qȵk%T~ *ºtZ}#u}6| =ߎ^c&t6a_L:T.fnP2Kg`LzPpM8\ z\KkLwlmxa;6)nK |zr(Ev}cCtkjɣ6mDEUDъ'(9M)fZR0^R}b{Jƨl˃@Xȍ/fϺfVzm:z!G>^aĵX%>MInWČ(oۡO,Z{qL5怅{/\6l=k{>>_|*.Xw9`{yY"_C۷M%$ڛefg=8/]h:_kW5mmϟy"͓ß,2ffKַOVB/yuڃe/tQWE⚿\SۮL]ۋۺ4)!l6Zߐ{$5ul<6& }]sn )-}O~+>o7NyI\+'"Vvjskv_=, FeѧE3Ӝ&JM&VqU0EȪDB(=2fe͇TuMEXI(hW%RUmEqkAM&ٓܓ>'S6O2ow[\5&}$Pm'2'yOyU&sTmg6`u(BzM.=;:aWsyDL1y 44S=$#~& L%UǹM CFUK+KOZ%QD8d<[>tZ\:IgTM& %3m:wexuzrt>4[^@-QkEU'KPh">f9lqcE+])gdY1@/&5e@$6PZT(Ɠr4c,RFK2Ji|^_"T.Ћ TCjkRD2 *X"CPٔ*fCQh9GP^hQΘD"-F7$UT8,c9 ~ԱYI%Έvk_k*Šq9pGˁQ oFbjgRfa5mWXn3[ؽAqxzޚL ~aC8:=XuvTbSUN`#)]Z=szgk4J u*BBbF&*Y,b4FUTz]X!YTU浮BuJ:֮6UL0RnMRGxo_~ K;XHm7skwޛ}xOh~Z$lR1]QRņ5UavROVyizK=@O'OCq[T#z6[QkjU}l5K 8E\Jm)?{Ǎl_6n[|(&M`q"C =<ȎbSҼ$q<3vk`M٬baU+,Fg3d]H*w:Vg6M{zeM}oB}ޤxuьMN5Eح m#.[»w=nQ[;{; Qߦv3u-sֻԺͤi8-lI GZ6Բݻ4z^44NCK-7χplweyEq[:>$Yw7l)ΧgMwݺ\S+eZ=rSUpZMD-SPP2r맦} S۵؝s}fUY捒p;2 1Uvv?6̡dCN ѣ22pP3( 
X;(RdI<Ū]WdJXJYrD:ʡRI+]N5Iee+Iʤ5Krr<&e}qLr<ǓHNrH$e\}?}ϯϗJLβA{(HoB ayƵ}[_<FJ4SXk!ϐF„AO RI(Hi^ c UO0:>N_D1nL٘o~< ]>ߞ}=]~>a.a@? F5[1;r1rhަQ޺iY0^j+DC6I}\, ZS)I@]0 O2gk}Bgca>eX2%ewJX pc#Xfo)Lsܽ[e;6rնNzLW.}o֌%zU5xj}Eșx9g2R-c5qvHK9-63k Me[h:[W[xzZ1N$Ҡ?/b󔝉 0ceT:q=(.71J)N1UFf[[Y%T🏂lJcѨ`TRU2pY(3K!D gA\b͎Zm[jjwl S#JNWpQ&Kf &FCC>]*1Ȑn׈Q 6tHӱNnΘPJr2Z%rQJ&ne\XYJjؚN:M4mvC$6N&JGu> Y #4JY#+"+uԤQ:jRGMI5&uԤ.Q:jRGMI-I5&uԤyGazr=a]"VP"Bg;:"YiuDH:"AdZ1'3vL".3H+ٱ,R*sf*C^W겮_ ~*]rqTl7tY k6'Ty!GɷWڥ|2pivIIآӤ\xghad[kbfSbm:pe4" "Ӝ D21: !GBe~3>%.RY8%<)u21 ,/\rt8;[JcLJ,9E>6ȏugwaE[!5V=-.wъވlRpt:bSBeN-qe )o9rIKs.4x~\Hw lwxOFMʌ.k|[3J\ߓ?-yfze`㔓|G)^y8z7yU[BOho\Jl=)>r<ߵ@JlϳoMݫj+V];??~wu|VIgٶT|;znn=Ds9o]_7zNjr?U?OwN_'4V{W ֨_Y򟟿-p@6إ_ t!,p?KWs/S;`Rŷ[/ˆByaKѲ|~>ۦȲ5& 5na/el~2!e6+$6F%._9^,{@{Y62M\l#{U!fj񮳿¤؇Xhkvg&vͮoɮ_|]3mDQ޾ __n&7ǽ~-_$>O? 2,%be!t(g#J^=[`W|P-r-FVj3!(0eA+O&3!Ơ"KzWp Ad~ %"t>oS"C9 ") ^&fjxǛ#e+/<(wsFǎs1d$g"t Hw]NS:Y&(yMbIOIixI9`)hQOPȭNW}J)W&u,iE"3ʝ"1A'NFף -RfgFCVho;;"%,Z:Vx[(`Pe/ApFȻ$"ȹs,l)hcq6<dчBu9- eR )+&," Y{Γ&l(`!AXc̢r(8ݩV|Ǧz_J_̼ޞ[guwN3~srFY][g"ع>r}q3uBo N&MםLVp~oBto7dfe߽?P:qFaM9ؠI9dPYWH'ꈨ'iUEJPIbJ421pA!Vɜ.wo ,΃"~X~b凧Cd> .[»w]/Bȍ6!͕/msץ׷0~Nb?(PZ_Z7֛t:o%%P?lͶCjPYݻ4z^4Ra4novw>w;ηt=6y{Wy [9{Y].WwKmm>el ?\VSwY`En](0D#76Zl޹>pvYgro},FI 8Gvvv?Юd-Qi8ș@U}f,IN)N@CVT݅*[( U:%ID  2$ݫ#vhs}v5 qۗm_{FM3iJ6?>]MD5Iee[j;'),!JN8)+u3qnO"9!ṵB$jD&v ^g}z\{^G;Y&SLz\!!Kô'DR!N"=JI*EJg;HR1׉b!YCslFk iFڌZ Y)=?9=(4 Pգ.W[YeAlT3 T\6X4@Jxh6`jǘŠbڠUPXڈ8mtPV9(JK?k !'=F.g;*΍+bd`-*.IZK0%Ȃ%F[ڀPXQV܆Qǟwzvfv) !jR-HՎc!;ͲQCRsKB^e~10ƢxՑ%ER$KV䲬eI6$vX9!E'39: sPI$fbA@@qviB,9=rn`dQ" Sd5Q+4*. Q@ E ܝ=Q=sX?O(0@ 'ah cHaqh#. 
A-3f;)I/gT S"|2I`gAԂ#a+EBs4I.,+$R'Wp' h(\*5>2DS!JQ$nB DuYҝb71.Sd,·('1u6fZKoNڱ0J`I NeXJq"#x,2#*,txe7VIY5YJ6r w8 `+@]LZʛbH9N>c<#J-%4B EDD{O6K Q9`Z[}Ί2v݆]{KZtp;@* E_b bpPABBTcx 2[jȭCJ %@]zz pQhu5qŃۢJkWĝ:|qbŝ>*HtQ\޵|}G+Ҿ^pq8{_Wnv }xhӹD ôA@lP-U/Ԗ mBM H\A:f[z_s)&Beym?zGUt KM "m\tSu2C:#:(gL0!JBĺ CJy-MWhV6 Tj?NQA0m % ])ǥ bg洿':}qA1頒FBh11DK/<ƊD(5fԘ |_I߈6qaƙ/74| >2<ĔR< |>0?v<(RPf5ld| 0O/>*J,gB\;S2ʁ$V\%JR!Rt@s%a4-UQK\٠4EOwYM~lXSxƄVdКR J H,,1 de&c__UWSxW}|T'Uk8^LnVL<2Ntx5+H ?_9(~50AOK/_+GzSӟ(7Ӯ!Y^~,'/g+yK]_ 3WqZ}6JлAwe8JxE-;;pY궿nl-mU>^eㅞz'R[Y*>k0j[ΖfWOw2 Z A($XlW~W5)v?C~w=k< "Tbe][1Ҟ#H9l0ˮzsO@)  ona< r u@9 n,ҳm(|!ckq6ƈbMnOTԫQFykF9oH$)N[3R s)8T&ߍZ/ÏՅ@u|ji{p= e)iQr,߃{s{Fq0}Qa9__b|Aϓ|%{Uj Eշ.FE.g8 ff:6N> J Ӗa{mD[C>k.r$r13<" %HBi9LjQˆQh5a6&U&PBndU@; Xxt7 8H?uG -6wl>aP 6,ed,Y0'+u<`exax82oL,wJĭtQRSN`hD&mS(Mbf1z)i#m(U8\ٷLuKفiZI蘘Y›l+ 5IClKn Lw3&ɌJɢI5bKg'MWcVU]qmb;n>XulI Zu$":kWbO(*57D' F߀\v^4F)3;#TU A@n*eL(!DDS] }V RTIJ}JrXJ5O ֫ n U|\a {Hj͞$ٔ,Mt52K.9][}3Yґ8P@e}xlpW)y%X w5%5:Z4]۵H8DbSi׼nR{4;+^ҧJIt"T OHa HDI֎ws35j]d*Kԫ qPW/ŢOĂ Z"Zj}b]/]BB|IȪJ(`.{.~\8O vP0Qp1R% V9r|풯]{©4p#鄍+*h" `8V2lT(Cc뽃m&LЉ񉕵%'!b\d߾m }Nn^r[<0:_ 017Iќ+yX ,Iz%WhOvs}`#t 7G9e\`|F#MH.gq:R+gTJ3N2(+r87؞ \`j䳘=b,=;5ZFZKt:huىeu= Y^]o\񰹘Pm)qquR (oI`xwmkai4uE!x==/l]}x055W{(i_zݶ؆e1 Ct@]c5 K蘆ɅᳩEᳩ HU_p6,=ĬұmSqrerUoJ5# jv`A Wx^7a6~wh a xUL@Z/>l)g bv.Z:RKѩkH%YKE-|02KQ*Jajd]Q@}1E(?Zlwęt &l,H.bDjɟ;E*[UгEv,OqPbBǓ⇿a.O/ðڗ_5WaԹYc  `W!Jؒ*4 @R^3W&.NAx򳚫CW;%^c-`+ɰ#1a1&5D˰s: #DFGr8eUߢ`}Ѳ/ZE˾h-e_F}*(y(_4,#,'h+gaɯs:'ɯ3:RpSTepQNQSTU9EUNQSTU9EUNQ 5ܾyiu{#G)w=uEfKb*а4 ;‹l`>k˜gdᓂ]lu[gh (`'6^S ÞI'PȽ!I!wN9ǨG-tjFi-!!k>S UJb%f\#&#_FEFܵh>6`PtՃF6urljx.]76=ذftOUa:y֛ȍ擛Gʉ@hsױObv1^]Gh]cn0ٖNm6@oͪв -ˆ[wizx{Sp[-7CK뭑{|ȶ3#l:ұUvrj텑;-F;DSmeH9277Ak 'MS* $RHlw0FK@lwmC$lwnUPl@5pK80 A ,RP;"Hc y l1ͱ")0 Ķ{2rzSL9~{ {@a羜Lw O] vWͮ0R}mZ8%6O_vVY aiϠ#^j 2"QcGwǚ9oJ*FBJ2'!dR*>t(j/ t(9B4 g9 |eqަMn"[G,)&з[kkPɑ |-y02Ix}xqaAX}6xfa\GXxC~(sxM^c5E巔ӉH~=< wkigK?;dz#6LR+ȽdEqIܤdr1HB<,PYZsaeݥX 2+b$ AHQ`S F%+\9e*gr̢ʈ]97#vҒqy(]M:vEm[m=Nu'#dqA%E`,L&MgIM? 
:Ywa;kc} @#Y "+:J5 Y4h8FR։E2 ՚s3F|mCAj+"BeD{D֋g3f"eCZLH!B[#YBYm񾅨*#bla:⾫U}\%=pzGv[]1Շ=P!TPEɳՏMWJg9wlkSv{_ZoyMiqL[׽:8NFA<9}~_~{|__s/ *Ro MiS&nϥA.=W1ٕo;)9[j3+=?{3;b{r5 QOuO\[TdM[p{hgk>NmmR?i0,Sw{#:AGGޝjUYz-#ƭzV cI)9,'ϞsN>mG _^1.` ?<3B*/Yub~Os muc/~!+{alDtԢm qt{!_oQ|ٱGG{Җ^C}hnRvo;kƛzwMыQNK m1z}^w2J9[u܁@ Ż ۈh+ƮVkv*%-p8v OE,g4;=49t7ەB'N@S>Bh\|kYM٪s:_nE?*TKj+Ō`"("(6VGLǠ`+qћjw[M>q5 J#LP.d`0(aDdh(UHY{+nG<~iO&L-+ٖ;gS©.t Zë(;U( k} +Ub,ih(`xbWU-6n3`%zJx"!Z孰JԸ :gɐ@*q)?'UIK9eGgYbDI|8 NX62PFcvZ"E̼:OVP_S #wsG *; Vď/[ShșH&Hyo ՋJAzM9H BH% Pix`KEQ)va]Uȼ4r>T-!!01` L>38]vY5k1R $!h> p $Hb<6Y ױұ ƒޡg~k_uk ԻXgCKgw>B<@gY4a+'O:^~[$>fXA:P&p1oέ٥μvSt]'/y.[M-6⽎?vڻ+yˏfg~jw0A9;/JTw]{y(t@8D񢩧{`ɿvz@9dQ06 pTX JUtQ& w0DkY٥FN+^r7_apM>3,lBRdq RfA*^ BrnZӒ SJtue!+nF)}WEmC*FiВ7bʂ9xՆ^rcwbظ#_b28r69enxP&XkYNV.{RCC:<w!)t  vZ͹ pC#.ϥ8-BqoJ8'Lz?t} %+$W57-U*)}**+g D74A[#mZՇm=8>xCW"'(9k%E!q$n]Bf-j=ڤ8gEk@ TݰHl4@J8d6`DJ}'6hYkTk? C)fV$S7Dr,{*hRvPL !SƥsZU]bDJrɸql.aF H u4!V39/O{={fvàĐ'f-0͎Gd4zDAKcge_ )<"U[&NZK1cbĞ-P^[Ϫ5gK=;v-*>Z9hHR X-RP6 ⊴ND3W*PoP:a"w9Tq>p8JיQQ(Et (WOQx6N}xd(2n8'X^@Si`#`et(MU㠲T/$+f ̀`,p@!QseB $9wke AI< {m|n8z@OEȮʧ++|=v Sٰ!8_VhhU"zƒ%EIYŒq"2-lcAƚ oD;IȮa*k2F?\.iKy/ ]K$J< 6y`j%#8}6?֥c).୔9sl& ,5jD5:#)&wҭ`V{Yoأ94DGB2y_}2 LQ &9M(|p뛨 Ү=@ˣ\0aj=78ue@🰥DOջ[{/_GoG }]jrz/NZ_~Tû% ˖.NwTDSJ"蘎8SmP%$h@rT5[h*7*2Q,-BT"XS}j`=^vUh%1ymfS)G\R IsZe@}j]\bYY% `\`On=wCF%φfCazn[fy[={ :KiOGȧG5>h1FacV6 R oye8mi*[ږ"@B@Y`A̕0wqD CcT/k&aNhY75כ`YA/ TO%n9t^e^S>5% ˲$yVGO1b\g4Gg4^8g̡.t3֎d~,}lIB;tۘ&Fy飳`Ԃ>Wh^=zHQra3~]V&džVͮ,7Mޤ(0_sxݮܕwA,g Uf$D%YaBBeN*a2z!1xr֦Bv9bA]v]ʙ$t9]sY`fBl^+*k/},i?~o?M|E+zA *K޵q$2?`3V AK egO%r(V+TtUU3" DX,-ɉN2liodA%nJ878[X Z&1rnټ`ŠzL5 3__V;šp9Yg]Wf+k`ƴdIo{tQORۆ@h D۶!O7gwY3gIo@U[/F0 7$^ݦ#(l.5TCpMm7t>6,xl$*d-K&i~yC-tmfb52Qê=vÊUO^;B::k[h7Q6|V(j=cޖ '-F__wav,2pNl9 6)C7E`oXz)FDPF)᝽ Y`SƖy˩w9"l:zYu4t4qu>tAB OdP"9TN%j=AJɠx2H*5}( dU"SWZ]\%*+ԧ`Nu" JJv*QzJkʟ5 LJ'^ DY\}6 o+9\\=`}hՓӨ0iTrt\ ?A\, &?NmCxW:\VΦkr+# Ǯ /8@;◷*Ǚ  $3.LSEʧ%M4,N7y>Ջ;,Y9~$Glo=bP 
@o>ܕzM\_+R[2Ee&Hff6'gXOm=EqBRen7r{e㾽SaSJPtiŧ+]0?CUO$WROZއ^So=vi]^ÍmCGQnw/x}MlMm3PfցrrL[HkU$LarX⬳qs^XktE$\.S A2 ǨU1kQRTkY0x];J'J:J~?pˢ'FGǼV/~E+KOw= _Yf9Ĉ Q F)qoxP&,1GG Q`ya,065<^>BC\shf1Z:R@$ȂTDV 3âҌ;$8"` (O Xo? ʏՂ[UOO>|0ffMpND$w&R‰+k1|Fd1T'MD(3?>\^ϵ5mHv )#1.` g9#F>Xm1"2Eό.#ms!q.J`kG J P,S*RDAr*\`+D-\5QUY e~,:v:aPo~~կ߿o\a^7}+d:$ ^!*9;>~?^b ĮAdO`I=#~[~Wz,zkFü6:o+~-IɬpuiwskozU"+9qWBrte:v&ĒQ-2}6j!<̼ٟd6Afn5oyW'GpwEg8 x?1h sţ}1abGGSuuZ4j ||GnLMt,X?vo`N7TK;j%Ox}ztOE}`Jrq~ !O䯁%lٮgG5V#ƌ(6(h,Qub!Ք*7д|-uX _t[k܇XGO7Ú m0kت\`\\(}\ $-f݇+~x{ǣ~%Ֆ`>/ޟ\J=_2m>|RKݯW'pU5%ΣTALuąBRADD(2Dor,jmie>ȳpTꬰLۨHg@EN!U W%o4H2YģϮV\KG=5ere]LG_{V27MW(DQ)(OD/ϗ\o)x)KL2Q .UB3EiedF3 D$Wi\GjbH2ϩVACQCN/Tjy"'^q<-YFmqG^Q 9HEbQsGrXA(i /t)1 R{w!H#ti<e=|`hlm`(g=Ar#;j$|ΛGd-h\IYbp "JFc&='ʎTnWMy,pI1 $ X B8m#QX`b Kn#N nd#Ap)BF~OmAxs GqŝۻtlwXIزn_`=0EI%3ȖXR @y+v[+)|ْ  2}WZs^c(&*ʘf fƅVMsL@6j(%rnSԔfLDc)Z )$^NKcJ%fZ:o }A=$)vE{$ŋ7 }}(gMͫ4W$ Ϡo |c"9HQ}YYLM.~+ߙS0';,cV@ a7r|ovQݨSFEEHG<68U 9cnq[݆ga-|/5^F).qW?R\ o177]}^*oէ^#Zj$|\ e:X:C5cUz2]ACO񅞰# =a-즰2E  *P- `o1h6VQZ4ż^ AHRVa S띱Vc&ye Pe45[!--+͟ |P^Ybgk{ʯq^;G_mItrx}ʭ*Pf]u ou곩pJVK#r%iþ*gHyk޶5|dJܢjezu0Nm6+t6r_;ݮG=OQV6y0W76&kf ]ӱ7vS\57. tS򺬳mӶ>޽gSB 6!5lJ<)`Hq׀5q]I8twyxkJ".uTZTYX.s/ݝg@^CѲ pz) Hz.Pwŀ~4Z('];aɵ-1",D@ RSͧu` Xy$Rh->ƕvAcCaW]_~vf,wJSωl3#Ä 6 XT{aR$L4p8 Qֱ!OYịUh4Bk +( FadOYEz^>>N{\J;w5UbV%Yx}_^~N jIU d*k!9 E!o)sj-qDQSqTr &T ޠ刨(uvrMa-'E^2~q5kʹ5'{_p^զENAT^g ,r^=HŘJFXe1yšs-0\6= xQ 7R Zշuj aoRYJB6hii=&CFù^kj z5;M5PPt<3j#xN>doUN޷bN<`RIWq0sމGGNXF}y )! 
3 ^tkrQqXZ;b.$Nu;m:©of4`<:㩈HFDU,+^gQjB`mt D{6j1Z ]REe\WDdJyzѪ`jÙc"(xPJzFۺ8$Sbݦ0"ފ6:^R$\4ꌋ<∋ YOW܁Rt7*l}Y^czߞ83o6榒S`\7@.aJ~Ś"}v܆)}@ggÒqukH!E9Qq>V h #&OKsx5zjP8jl 5FG{tH =إؗ*3Vd45> 9geGDU9&qYkL1bQ+am&t9jn7NigKDb 9 )ͬ+S0hREgNUCik<+QR4AˡFe*6Te2>EJs)Hbzm:[GPrVN*٤6L„0)*b!gCܨ=U䟞!+,Ơ FɄb֥VyE =Ple A"kZݨV:~'`gm=ialk-Rȳ-RјFd]c^/>N轓L<1%vf&|) xbb,rH6ZW-3lR@(Q.fE~\MjPM1XFC"26XZ9_<6Oѕ|Ty=Zǚg$p#P1Kuێ6!/$>{E\LZ`=]*Q{/tv1 @Yjezt.,˂ttt8'߽m7ܔǥ+~W7g9p~8&*%JѮzc53`M[ґRRSZR t&eMVG\ '?\-j EC ),˵&b{7NQ^/zKgSI֥zO !Zj3+e&5<zx7) [[2}09CQ첎#Qu<Mhfn U>kbf*V(GU6^FYS(c֮E:r!1)ggTuCzM }Z][yYOrdžB= û7 ̒^'1#__e_ϛ۱4ϛu^,f敕Œz7sz~iEEX>h/E/?9v$|.}roˇ|XʱP1y lXzŷ^':" E-Ttr?ףeYhwt󑌽cU6#ƣg~] .ώV^+rG|v9ݎ %eC;|hϣoƇyNihDml2)s^DUQ .Ql͜r|ƬPWNkހ6{WT)ڰ7L $3|RBUyUt;! bR|]7Z4nľ0ytwPcqՎWIP/ C_K/o/ +R`6&afc=SP1fm,ݚ%R)ˆ;A&ˇW0qS6V9qGCtr 1`K 1X}t*%Tl +0hj}YXU(f Ú35Nv-f ŀwtuo;*`ѡ˹aZg'txȈO}谺=75p?a;nƑGq{rJ-ʚ|r ;B%FD(&A[KJCnD>s6;GzFykًT݇@k-˼@\f)r]#Ztg 4Ҥɭb#8^bFi?GpPse( @lElͮT6QiCRzw JP֛bn?zFԣHQo4>_ڔӓ rи"sX CV SC)V#b>kE %1 }2q]xޜ|z{qN{?v% *x$hCQBLUtY^c:h.$0Փ᠚-u'vp-=g_kG^xXrV?nwi'a:,͎o08F48wrޣ @XnPn=8 Wj3zNkrikUKU8yӝ#)lT]b ;l^3Ԛij1V{s[ӣ=뙍/VRNuοٗl~,a|%v;Ul4#;;m5x)w~:\ξ;Rݡ)/;DkuoMaGr4#t %kPn;\t*8PT8'(h-sEK_]ggVjz 69|^.Xg%TN92 _qQj1D "b&D¶>ʻbooġYiذ{}˅~h0xbox9wѿٓ 8Cccr>|lRZ dCs:o{ku<۶[bQdZ_7#δDY_e+(JLqU0Y9eQ{Fϩy[R٩ 3jf`O1xR4 +0W|΢Ҏ%oI1ŬV& *,b>зć1YGaۛZ1*[@yp`#qۗ@x!m i=DY*Cnٲᕜкu=%LN ub6eOwݿ3K,:}vS7 1Wf?o*zhV.WQg] ]@"hE{ms,J"&&s\U:$t, .%HCJ .1(Q(B5 @/ jm˴,fe[&H 2R8jrlUĨ5(o l|u&2/?:MiFX jBj$DT#̙v6gZ{㺑_i`|(zf`f N`3@r-q߷x-ZUپϺ$TXeQQ.5V;I+Q Qdn)Vn&"/v%ڧSSol(ZhFNHbLo6RpU" F?h4)Nt__~חR~q0ʬ XZnOv?rtr'oz7Wkz~9c{F9J z5qʯ+:;}8"/X:gSʇG2EKjsۻ07n [ zl^ߪjycͣEK2h=_p|pp^{qݾ&';]>^z_x^yRNj(N~^9rM^jt4ջ8kl iMp5ZQ3NO><q6p~t9uΞxrȜMy_Kvuǵ&[`πrp@sg!o!Z}S烃ս˪~{G;@X+#]9Ǖ)q3׺xۗk֕#nM,EɀLSQ!T:+dc,9@3j籴2^}~GǞX@1dB* yakW gA礯񒇥&KK{[٘en8en,0}%[ɕTkxE0QTy'|x*(N΀0*wf)[}(Y~Dz$Ll<33h &[d{)CQOR(>h|HRL@g%rs&YFJ6&}N?i; `P$j0%X!i0?|iriBeRRػO)Ƞ){@ 2YɐH XdJdBLY J6dBU,Z6#gkUWXD@ %bI%@ Xt5حhS[վFuFepJ,I݈!>Cْ9i[&_>t1h!ju1 XRUFmTlYG!2K)֦)sgE%P&YaQ)Iy.z5g;B떜i']<{_o%=P@}О;;tq\ \ ꦟ&Kf3s?k 
Zݿ\GkZR$44dF GTX32&EҶD٢@@ZH<'#uS /R3T1lr>)ט6gGx>؇op\kQxq;ݛ-{mC&KwxJm#}i.-^»?zrr?sQaymn|EkEM-$V"[=e1,]_L{Y^ͯnX{[[ckO|~q*'_0LyދqØn-9һyWCsKw] \q*kx*> tVEm6:@ژs_ t N.\ox~dr\/y[>u<+u0M?M`]o'ocz:;or7i_i$R [F׳$ʉzs(~|:_^c[K/[˛o7YL3-tcο`P~hř^/Xv?Mdz9 F/|͓u_{uu<h}MW<>'K׃m]7 uzf' \xPd(< P\vp5>ЪCr0, @g k(!HKVdY.)b"ǦFeҶsF-2MD]4"hCe ]8M;j~%yzeCBO?gSu'_s9 o|eb")'fYgZxx,HZ5 N8N e- R .(,F$*#Ju%wYgY_WocJ:hkpt]ZoS=_@0eܠJuA}Nu]ucvdl1Ci!vek1.rxH *nCt ܯ9{j*0[m6vV?8x]ET -oAUt$hkx`F e!% mK>(\$[iK-ST(LbJ[ `XZL*J}\M#MQy1Q$F!Q;=dx2C~}BBX?B 2O􍋶9%jrIrRʑ2bQJkbaj!ҵ,Aȸ-X,W,}zV֋4?c K<~A'f_8bK*eHy%md+Jj&#ԬoYJEW.j;:%UM<:^\ki(W(zo+q#vr_P-jƨy3ZX)Sp&;lE@Yؤ9,d%:.`|l;PQg$)61qZFĔ٨&[/`7g;F}M5=TqoķEDh0  z4O2*GQ %i#kN8)9+Xݳ BB6ER d8֪*gRN95bUP%l 8zaUu%Eu Dc\ .אu{xK ?G(!V5cmU%s:;&xhslg2 .Y/h7C0%hÔlpC'3DW"Hr4VOa$445OIpƕnB :R^ RdM2y-Il@k&1X7IQa7]H~|{}k~\:)ި΂q!gDH޳Jb)) RkAY:jx根4cV3=ٗ=IP2 0Yx)GysF(b|IUW$TϢ(ִU.Vx" P{QAy85fDK?-`tg=k&Ύz5^3PޔͲ&R7 .:oUNƣ &g"(AZ}ppA9(OBI2Š KL-qxY5`b &\jFx;3Ⱥ#m LDRA.N:)J%X$TPvp4{1qː$xm]Nk^$ uҶTZL Z5@L!2iжvlJzo4"U>XBve)ת֐RuEVx0%r<[lqdN AVA=کNJf#xחvG'٬N7%]ɪ«_'5e0ǯYf LX.I56(ǜ(s[$9ΘdAH-9 JDɕ-yNe[xݮ \thod}4:Ř%J: _quBpf|u\]ߧAxuM}u{Tjt,7ZmgzT ~̾:jd[$9J#۾/W/_|'q]%t_=Sר5J[«Qu!:W^``9Փ̪o<J՝Ww^٫;{ug՝*D ArWUVArr$d)Ppӹy&[h *6V$Ș\cMI2#Lڞޥ)a[ dmBGfc&7V<HVm:hsH&E$K&Jf0M|ioQl_e kkڇ# 1NJٲ像RϜ%dZ2\–UsGf\d<; zvj=K^z|.2H3oFD8 [DqTN.ƒ4%tm2!dĞS2d$Cѱu&0I?5[j|>ui%Q匁94S,dQdz|ԧx=;h1' k>Zd8'"/@>jfAluDĝo_|k=n%&<~w2{RZ)՞(ZwwCԉb2 6W4V;8Jv#%ۄvJTJGAxuD $*ZpABt3/ESdʬ9J! 9 4t{|%"j- LY1,A!Qs'yr$sa;~ǴEcBs>}.OUxG. 
Lh>OSҁ 6F>Z/+"\>;^0 Dsv^!΢SVYqށO+abFfC"ĒCZ)Z9NzxP2PHpw A "zA@Yq6eL/'Ç .Ju+BWS}DF /WgxINq?3G֚ZA頸ᨬ>y_d7' "O73O?>VBOE<5)LؽskA$$?\D##Vy _waP٭ A?鞑赾lve M6cFt٬JRQ&ADQ3Y?$Wޤ+tvSq(dhP"E][v}^@*XBvėBwLJ3$S2;/L9%ZzzLgL`!l_7 2#+.oi?W4q˴c&4?X| RSdM7쮥 p>_UnV0/cJV8.DVtм}rMeT&n5lI[E!-m;ݒ%:RU~X~9Rg/ܛo߹fG{P9-6#& $bg|Ȥˤ` 4Z[g YsP[:tWoQNƒǪCN"3T~t]{r*̺{o8a]\%MNT%ΡKP uIcI++!8 H<{#ؔ1lN15CtnRP.tDRd%,RNz@bO!s F&r3qֳ[BB}),=[dCzjB8x#&4$bb*2;?m.?gJ2m|:x[*VM %(SU,U-AZiU$HVpL6G4Q)[N+s;+p9A?d0q<:0bI:<X\aŸMN<.Ubm( 3^j )XIzc-s"nwޞQ4oMKb|ӝh0}m,F;modߐ=fi5""%yPf7ҡnuH7#[tv5O 񑅤]^l+̨ sg?Jg1B D/8HE9ڴ3|-݅;"oO~O&]d>}hm;eD:Wh;q)ןo%ez.vRR4%h%Ld-LV58WZqfƯN^ }Ktx³]:/z.;[W:[^-Vqzvjboۅ7ɯp@-x8zA/?fcWZ+=*{;͕k޷JԲeK<>Cύ;3 jh|zFY/C7Φ] Ʃ$e&FJkfTҪdj֓|J*SՓ9|i3 6Ieq- hd;,n3EW\=/8dik J6SOvͯݲrqܞ9)aW>-gҸK7r0T:D('[+oa:0/Rm(z*J'D420olf^dET8I,"G|I𜓲 ^mzW6ކ}>XPX݋`>{>J-cwE}>] G҅Ý*DLFk=4$\ɶҜ9p=+;yw2*Ad+#Y#\'cr6^0MbRr)ie2z G2,F4/ IP8T^u 3q64j7w&?@on\t[Of&Q[q *\Fn6[욪UΧmxwvyWY"]s Z裝&1oȅ q`nf; YĿC`}\}Cׅ1ya<Spy IC=畯8i4}G|w1VZ] uSsE+_w8H$h]NdX~9:+}rlsmms>?cĭgw`2[""!xd+`95XΡ1GY$53Qi9%d[I"3*wIk/,]=tƇظyM`wl}~t.*vg IfkU`tIlP, lNKkSwAeoC@caEBu}m$c8Wek*:Ywg|+,gG_rrۦjYɲ řA]bp/u3QhRc%kȎry>/0UC3(Ku+j2lObsǝ m=Ӝ`JC2r8S*cfF <#h/}ASStAreMV 1h ͆$1P{ФCRq br<@H?{WGdJ_f;ic37fhw?tuJ[% f:Jeadf28Xn@i  0|j՝E^zWOf5y]l-x &rt6d(8cdK "5 tMO5X.QBVBǨ$g33']ITDl֪;#vNrWP-jCcԆ{Ռ9ZX)w[B˚\PQ$ t}(-P$$Vt-buG֐JZt5Dr82FsY$V B-fݹ{~ߺ bq[Dƈ"xKb/3rh]62KFStVgm!6.X >T@ypgfW_)&'ۏ?~͎en"[o1zO3b4`5(bTT)BKv16vf.&mQYR"kጎ蓉FYΤYd˒ȭ ~l鉹;F;ޫ=9L/} Yڔbuz|?vy<}aXW=w}m*Jzas^Y,IdEK(S$O4HcJ?3@'AxPW!>kh RXPTЧFi)MAJ`@N* _Ou':wSy'O4E-l:֭lJ)A l E+֢."%,+V' iܚx(*JBX4R0싴:;+*rv!Db Ƅ&T@RUEb a9[h7XW#/7*kl<`Y 1_~hZFQ:um=x*4X'I9sy7= 5B/ϝ05@z5pӢ勾w'o=x}oŒLgORMR)7f [? 
~xOUr'}hG0-w\NP}!oRm٩kڗ^}s` v.#>d9q|pf㢯i?Y;dL\u<OG17w]]|SH mDWpM|[ \ 5UF \~t2mnz|v_2~G^דݟS\?,79\_n-AGe R5S '}bTT4rE(-l& BEוAUJzc;A"vړNEcAڮH(EU4^@sEakҜOv&՜edn-@2[]6v>͜ƝKWKxE ^TNݓթ6-ox#*}*XtTA#;dUUBo5ܡd(WO1&0:L5xTυhz-d}uvbQK9(H|> 쐧 &I8r)%"'f2Vy#0ŬE3%UlF;.aMUhK$^%M*W{χD,P!WC5z4z%{ykUoo]jaTz ђd gPT 6:`Xd`~!ktrc`| sBՠ`ϧc[LqX-cj;јֽn`LMOڑ43I.転lW=6m.pE6_'mb5-=+=R [ ڦ̴9.lLtdb.EZStFtV{/n$oQ*aMGmǵF/;#/*?h MFܮ.xMH>_>icdaa2G3f>F]yk=w){r]s'V ݬsgXO3q1U6p {_@TO u;=^?6)trk!vd 9"O(-)%<+q sT@ v"%Ml؍CWr\ˠ/ґ@q `#JnbZ"`$c6$+2*ms1`lI &8otibUw$.qy*2w]\ձ!SF&*g*|;k>{fduI@ouA B|2 ]6=O7]b?yY2zSc{=6,G?,W?Ag~Mjuv 5'|G ,j^['Z{:84=z4g\tߟހ\r#iaCM3B]-8 %Y3L_?9mS) m\r2cѓB)YTd] H)'pM[7PI)`q%:$GZ +55NԺOlQi1t|d /{s\cMJ/j,EF.s>Jcrd-V VMjTE T~ &kArInǬfv_rzQP4S,b`eB',ފb j V[bS9 ,D D@bWH9йHs_;Yl)g}D;a$BHΫT M5Xl( x3Ct! X5ACQ`TO8_#Ka `v&c.n6Cn+toz2 k `I7VEa~=r+7|pF[n> '|oӧ^ mI;KcWh;t QK^`@#jA݈}_&Bd A#J$dEGbKuN&t(! {;ɸܶs?w.>2z6FtTDjNٜRMO3Z~̶u؜%vtA[%uY;ްGw),R˹[1}1I$ilF C22D4d Hd)k&h NݴSFEӘc%L˂e}w*E'KC(76Ζg1h6_|q`xjKZrom+SW~yiϨ?QъQo, 6z 2sɧ bRi$$$a)<3F3 ½ B˻y=÷ auX1 >*"Eof 1H޻hEJ @V`v)g~bYf%j_ZY]5S812$ug!1.Gf'٣'ӛ-<6d-M=WjaW<s""J%@y`"XyRx&[(J!Mv63lgZ{ȱ_efv|@0ۛA;4:=݁,9>U%YMYSn$i(-}tҐ Gro-0E<  B );c`uPM%d7NH QpR8'y 2&D;/YTenJ_+"X!MB݋fa{Y/m&jsa< jbԌY'3 щ"XЈ̉rCTD@ARu`|vC"pLm:o-vWa|=ċ"^iS] oyÕiwq2Xvw$Z#Q`ε;67fwO:ϭ-gm9cbV,!ΰMNy߸&0ohJ8]{%st$ QuRO/.S8!*X@iuPI[͍FI[#`U\9fC8-C)xݍυZ qS0' U.] 
Dq1VJQel Ѽ!.%$0kkU.Ԅp1^1i}Sp|ts?S , K9Ziy+@>c"GV@ rH9+Es:n΢#ɾ4"wJ@iՓ̙\rԣL*RG WDb&x&RCJvEG#VĪtg7dn_<5w_ӝxrl)F΀wȋ N(уQ8-9急i015]Hv͞{2q'58֨ؐ 6!IBwH-t";|3͖.lxvcӮC{EO塝_l;'sjmN`48fOG!9 ۢM=^F;sw[;~p(ְs Q'>Pk=N\0O8?GIkۭS:8)霃M^Ԓ)Q|\oc>XܞXE>{>Tj33=z71˫~}_|߽o]/PE[|.p8J6w;<\Pgŕf=K 7QJI oqAH>^>j.K@fd*#3w,@']2dG@*ONjOmipo69WCR'SY撎ƨdb@Aήo3THY"!hFHce]L-5Bu]|Q&д*yٌn2匧omg-:ܐI9Zln3[2v޾M;tH7$%ZњYfsslɡǧl;0;M\[WMikˤr4$L8ݮ;=4.y\a4ozusvo1mo v`܎]깧6Mwvz@M~.dh!~[nݞ~< Fg[9_K2!'V{Y1h{{g@tIȺ$dL{ TORA4g#$r;TdP O΀$,xB(BR:kr4U7:MZ>Tjr֖ߦ@yf㧧˓a$V(]R4 89\6r%xIR4*p*&{CerQPL9bE4[&H]PKfLAT-٬Ky l͵g_r\nS4j,dY $o9g㤒eI%K+;d)%tN*ߠ\c'aII3>㊩Y:pb* <!똱.U7RcáGds/XU*}<&˴379q~p!&N}߿q6y>dђ ϒ ygLĴ9V0 :2Pc1eeBHGC4_yp7qJ+Ӛ/_J*OGuݹ!+6O{vኢĴ!zKHrE O !9C0 ۲Ȩ@S=x:$X1@JTO6!HFblF,e/XXlf< Ua,T 'W`cER▎oτ~ӠK|rg#9YMW)43%3IL(J>&5\PJ͈톃0g\ jEm]u]7e9'Rg22DK2G M:MJw$D!Az$vD !hE{5bM+FWHBx4=^k~Z$ bPDˆ"vvbO%)"!<9.h =0#SdsD`IV6ݧJm(&)ԪKg\pDTQQ#x4[wߜ8[:xwP{fɃpRM.Ml>wLAʅT@ %7QFOJN)hȄIx4 ;\. CZgwa"gtgE_ơ^YkѕT9O I0DGx۩ ? A|qw#sZ'ΘH 萤BK*W1 DR¼i:#+@\ZS#e.4΂9Z!2i4G2 ~1qĜzlUN$¥G~<UIuKOkRKQhkY^;r:XŒVsF YMYQdtDqEмt1m9$x󹹎T5u8iW#݈L#$/40]W9*H`45"ZT!}P8qO 6?uIѰ?,eUV֭ז0-MI %Dsk$i?d†BneO4mք։YIL%`IŐrEG*YJXb-0:E*HBaOF=rH vɪ(A@#r3&r[Ljl7I[*6z0?<~?=oO(7 >WXǎ2ۻ|G {}%+ogOWbgٱctr/4Ы3OM˷u&goy o̎b8ۨvPKUoId\{KBfM8s+ˡ5?1~C^[TAjG˅Ղ9w~VQ{?Yˇfs}߇׿ξw>_߾POQ5(Fݹo7]zIϟw4#z/Ƹp,{Mbe7o^ :zjymn0R1hH߱qf7Jl_Sw,˦ǐ@c,=~4}Ke}ʸitHIr8;SbȻ0>uA&ˠpxxyǺ}[JD[ uB,[ *=pM'<-Qj/{Aӯ7:/B.dzIsWWPjgo<>>:VZ+mq_6}]*F Qx+&Nu롳[^yQ[zQ񢺱ƫãbW\QV $XUU]_U1ߔ^=j8"ka5ӑdw6F#AK?Oʄq,NDid$(D#pՆk@I 1"Isx7$ m No[!Xර{bl7w{bwqӢAoɕ&~z8,t,ǀX>f8L-rH2WW%|:h̍ל39H }NjEmJ%r& @99RػUS(xY+I8R)" OPiy|p`O"k#g+4Ի*dA[:M*{O9l!{)d]nF+Ѿ${աF=wC-@!~9 >fzu!h&m:flcc#XמuI+p13n23*i~8ٍ\ϟ$NZ枦$nT vX%q%qdWg7Jҏ0H5k#֌{{=R^۲ݎn<smbid6m|Bzu 2lo*th$XtHm5L<^/"2Y: nP=O,͂7^O&Zs:5qp$NDS/s?leGt<ı%Z,)a {;KYB:Ox <'wy;Ox)'uy;Oxk'?OPy·r+:O.KzqhደU`3*(FhX 3BM2F G'+m;鶰m4cmiWгȹ-lr=&4I% `f({-Fu@'RJE$:{x\_!GFOx̥SQ2/"DpE=btZ!FϕB.2K%I*xFa*jlij;~ZuQl¶`=]OOWqNFCx3 mZn @jPvu3i{Ngm)s7f!lm)4)rp6pATv͝w>p =fI %A7{aGO5~*Zcw 1PZ&<T&m\UCf %'1N^vYbwk%(a 
.&"E&*7"(I.1Wbj4k更}r5cJnPU3Cj˨L,_RpZ!i ᨲs:DW2=EPJ4C2ziGj߲\b=$+k.@[+.Ab ,ԾeU9xqy](W7!՘b%Y>,7#{}CoM&a*`4qFa B>0m(3phR;ƽAL낿n W]_puӆ|Y`{s>h~5[I(}\qW̌*9TWxTч2 >3=7QEL T1vQ)EtɃ{}UY/7buv8dPE+)Vh-sPfV#~QNZ,@B}jdfr}É_{$Q]ۨ*H"gC`,kɺ<\LU.SF@{Y>ԕٜR%/!R\izr9`1nd\IHՖ"F-Ida,4ed¥b)diG4`Dd]r#!d " -ɚ GQA)Ā@5Pۅ\a Ke"fyDt%$b'7Z$<+)slJb4'uFR*4 >hȖ ]U"z3V6F$W%D.r&NAíDKe+-ַӱV#~i#u%{EqXe;ōA ޗQKc.g0nT\KSy¤d'C.<;<<|VQFݦ.! 9V/_hJl 7%[FQEf%I]j=j|p<@x q|hm ̺M &"i˝NfEEbhZ 5TuIUm2Ͻ2f(!>̵LF3%}(Ue&Β ,'b.#cB& r%H*1JR҈ԎLtѫ)7WP;z|6.F<oO5)v m7hb7$gbk"UJl ʶqSo8zzp3ok'}j|GG>i٦3+k72}r6jqgBb%3= 0XI?̄dJ !Е=e4-yv 4ңh1; s%.njv"KK[>xumԹx}su07 j=ofQz>Yez(*)sъ>p>)CDq*lZU\jGW\!'9dQ[pNTB{H6l3K; UWCZ< CL41%+FrydP d)pVGiR=J∊I>bDbXIj?b҅Jɺbϰ4h>ޢiVUtT*Ҿ9Oy3Ar& rSD]X `a@AjHA[c3 >9sLNyƍ;+:X3,\8 xè<^ߗE-XCfy,=?`:EP !'B9E8Ub^4ë7T)•{AXt204zWRI %kcN j`| U)%=0r4& Qa.E@Y!j3ͥy*CGjbIm]ӣ^Q?^LU5hBU_oOrr|=t2WSqskT4uMiɛ eNٜ"m<5\<= =,N0%퉚|*)U/w"bFY'.@o\3aGAƦ9m<5;|oqe[Xs~\ VCͩifE$Zɦ aPnLOӓݢ5v=-JkGOT!qemh9%:[t3:{t<(< vuDvqcuήͨ͢Tީ1 h`(˦~%.=C95tӥk{α}?k/F3-g}1(bgN1/,;YʺgdS1"єS bhv=d Ɂ؛yI ~R?: Uhs]b4*j*YHfV-F\:_LPjLEȽ-,cK yԈ YQgr-N1OWח|[-Jazq޻xuAv6uDѡNTgWRʄbR5LIgQTDwB!c_Jf}m.ϯqfH豴sDEJH:$a%S%ɩ %ϥOdUt1А5NK:$+s}n`J8l)Wۑn/Ka՗gr[lvȈݗiپ="yuZojeY|._9(bm,Ȣgg!RMcRAkJCnO๼Pl< yxI6Y}h K’,qV_h Oit=6M8޿0r|ַÖisLT(I$+AJrq} FbGwsJz7_%}j%g=C#C-R[ٝv#N|^zOp٬Go^9WTk0&6oubA'-`]Ȉl>5kB|@Vy6D2jk0 <#;dZWo/H8 o˛bp;f{q1mܐܘLGa@-$z3zebn&u= R;#/6M}[nzgbdL9,yWli%v X%~{%wH r}6K)߽|7ƒ Rɐڹj"#%H h|a!LAI'v˙=\^oxL NADʂv:6 SfeIz4s*cDz]@gӬ<͍'Ξ[֑NPj ]~r9! 2I WSq8iZ=w%pN|"_wzy& rϓsg7ܡՆ*O$lPnJ<8@Wf>ҏ'5ժ䱞?~gk&VR.1ajHG 䂭`[ sCNI-uV tL5DTbK 90u9 i76𙛀VoqUiӡ$%uI̭iط812bNpYp4vtCyW/?䨲>BjS&N^sK G,9 YH Y*_Xkoӹ&4ZPURè .U.;ח3s={*)mls4K{k}Y$K+U%D(|AEް f$B0Q['yY-y'o {,y}B Xu4}SBnHFw8 f՛B&MhEV6MO O d# IC1y!wmH?)t{V{ހF+6... ci{K2Q.-/lF> ,*\SYt![w{Tv0bZ:;ZO,EneA-JŲ Dd+pÕ~WJ?׆pÕ~p+pÕ~Õ~WJ?\+pÕ~WJ?\+pÕ~ҏMJ?\+pÕ~WJ?\GO{?WY,߯> IS⫳E..AɃaD* Z#`gGⲳr,_*]MbNeAHǘdhL5rOlmt8:.fKQH;n \6l͍.I)o|P{ng=Rڎ-Z|t MַF'A1_bz=! 
s2'-\p"R5w'o$xGwmpXka?TcP9wD}3Xka4}w49S{]`XrS'!a`lkl%Vr PF kPCB``+ BS@}ȎFs^8e:w{nmm3;xlNUކ._?mCvwiqbG:P'TF}2eApTʂ$Z?cQGYЛ) Bbꦭv)^ .CWOFQ)d&gY%DRp{6##+[JY%ːe-#g-~"JT6#ɪc;KwCz![7D[DPj/Ȁ3;ڣ?زD;e\̈́6BAs ?رaPJ b 6{˞(D]6I+: !>=(k *LLkKmS1%UVP`Z%H6K>$;3nx-6mgK[IշtրfԶT{ qu\Pm6\12UK_^ݬv|u|n}[tUlDhG\Kߙmܺm`3(z7!{o^v?pD\r6w]O{鹾 X?].ֈλnbܼs;#h}9a  kth*rwZ<ăoZ^u SV'MH$=9'R0Wȩ $5kXn^JUE#H""LC9q>1'%^]ƾmfCi|.SCL3۔o>~~~XcMuFtVM2!VkT"W͏?M Τ'L|%=5j+*1c6G!֠$g;gppg<qEA|dk(]qMq+%늡,_LY 綀L}n h?δ>W*WS ~֔ @G.RvrċՓ/_| Oͯ/NEb2vM֮7ZL6̥lBSlt>[K !PcGvȞ3yGEXsqLӐ R-V,UP zS>0ᾍE>i/c&NMֹB$ : sF^Ozv*5\`ḵnԏ[^?ׯ~+N+*&bRT^H9GtXr)l*Ǿ1p)Z=pE [mzjƌ.dJ!Z|Ad-؏tn3X;cXxT,W|wc[|o_<smyxqq|y[W&_` "DrE_)h Ck' oх֬k{e\`=$%*YyΕUػeW8;].96;ڡ3jڃd*u"UtIJe"BɜObm>-aaC E;BfATt5֔7"(%_+*!.a~<\~ƹ bTDΈ"Ų"W!rlM)Ibq5Z ]2ebj|Ձ^0p&^[1gQ [8ٻ6dWɽ]%@M`CbuMZR#!E(1h`T?tuuթDC 'W)|u5KEqX˸=.x/]  TN<גs`Z( q1p+xyx8x"^Uk 7~T)zj5J4~wנGÄFǰ.\>wF f%'1CeSbQSbܡԱIؘ-Hi{tg6ѭaqz ژji!@0‹Ls&BNX6Fs4޷RrE,u0(1 aN>6&iiOS)p< 橕u-S1aupY -sFI1 p ݨ LagZ zU,\(FFp#F\h4S"s)WtdYǜcH*6$S.ly%ފO&>;"3b)y%QM*^(͓K'jM1~jrߦq_~}O~O p^UT!¿?}7ELZ͛UՎA^ޟnW1 jq%1>^܊7U(_w/킻^u|) h~i%&ŭTjwkL孢d9;yDύլY5R_ZW,;::xTVIn[^z>/U"A)5=՟XP1{˛s?F??zSM&i||7 tvk뾹Vj_ۯ/No`W7؅>zl!\q`s;Hw'ds R3b:z{ɾn8aPWOIG_vhL磣o"CмXzFٝASFxJ;$6F%.O. }^E;u<X Fs»M`꺭W5R&y+m/cb -ؚ]}}V*`Jd LY/.>WaZʹyz:fZj'ӟk\]ФEf2]%ޣn%Zu,iE"hH׉躀>2E*ҌqLh0^:c KRo==]kUam-vOszA72Ѻ;jPh'A:}񵖯f5hTJc}ߙ:'UUU`X{Bm"~ϵcܓN/mrbmkqz&햽 ׄKFY]RHf1LQ)m AlPާӋoӰʢքUk7UC?߆E"WθB"a0q ff Wwn42 U>ؠoءCzק_SbvV|J/،өƻ7+|o='QkSoAE{OjZÚ1ԳB{N϶IU1}e9T.F47fc@{jkoUn6Mq?\?6s2߷`Ie2bHB̂F#u>םpڪvUG,eɦR̀I$mDeW6 ӐY9b9ȡTry>r0pHh2BG唎>00N9Zm%.w;m ypH p~3»45'(5g)_Hܟz]&{j hvA= (/ (/.Dv<) Eʋ\0[:lX3 }@ [I~A5Fʑ0] rIA ;D 9oûTō ޒo9GL!W)$Yl.چRLJ^s[ܚ1_zvhXOѳ䕇3Ȭ&#y@&d,+"x-ˎU=s,B4% 2R8(녱Ld%P@1dbd=d45qԳȴt3ɡZs.UjE#xV+,>r A*Ц | 駦B(L})<'#!'sBLcYY'LKr! dAf  Ptʄ60V1'h4=pmmN|%|ME x-HΘeCҢ KY&GHUqp/^l< ?4Г(_kSd[-+U:dX:ޣ+'j4*Oz 樬bHDTHL ۫G{170:hmle0 z9) kpC? 
nng| !Qb Zh&[Nt] u(J4&-4dKIS>2L;Od dG T`"jmLy=yɎ%;Uq$eI ?:G.҇ܿ&T{Xfo2%#HDacH|J}+i\q5 \i:\#\!7ZKjx%.U.<sdZ~`gת}"-vD{|K>Fڽ"fWEZiXW ,9&WE/`WEHpER޺zpeA)Ov6E\WEZĮIp J$\ \q稒Lu#\\اJX(HZyH)UW_ \5^`( 'ONM'I L< \m&e׬+\z.wb/m$3veŻ ڍZIG0]6vo`ktyaL0 aZ`I#*"q5H+pUTUJ )RMmIo.M|t~9hR't/|Ic4xzEmِn~OS0M ökuNE?KxR.6{Pr_$d!/vaX_ }l:󿟢>a99'Xh8^̰h0HC^wCcJRloV"SVo$-Eʮe~+YIZ]ki9 :"ˤ?],htД>A@t|ǔΫ1[)2q]"}K& A2Efs,r%J2_Gy'LsMtx{i; h9Z߾;uVWgr޹'wB<@]ј 6~!ͥn{l/E=q89lF0a >qQL*P#oxR8e8ůtNj_iٹWCůtKův4$Y`A̕0À2e5 eL. K40ZV'\H]RdYkoE̊QJAzN ! -X&h~5q$CtZmXȽW&6bk>>.>2cRwQ?@Hzbt9,rmbbf5Xիj8u|@Ә?UW SmFk,km#GE`1N6;`Y$m%Hr.~-YRtLيj]lV} TgoA<K[kD/d:mFa:[e<;{z1bCrXp,V$|} Zc5Zc1~*_#9n!cY3(o}4R!J48LjhwD-bFBqătn@0-d1qj|@<8NYŜ`b>k.{xK}(z`À| ~a80A~mop&٢qx_;^f˟L,e,dLBG,QAQ(>PƅL\"2/^;`mwƉDYϔҒuHc!Z%D.E B;&h`Za6h͙DLIP<՞n,AiGu1rv!zlhW'6Gy؁#O}3ExjoeQÏ3ߎq?bJ)Yg HF zc^ ,E7EO Gxh< lyؠP`T3LQGABkxH2EyTvSy`biOA_##7/A5&KBRfmڐuX, cRAD(bRX?%'".dv_␕]DI)*mnq`~w TijwXw0^ Wޑޭz{ >m|?R>fBt٦#d+䫋8>ˠ@0JhH9pVUօTL;Փ9<;'eI"ݙ+{S:u;9*s<"|0Hh}omP?$څȜ7~Iۤv-zN4ϒ^_/BGx{ԅ">]ao Okuym|5 wWM[oѧ+Bӷ 'hoZk0yg=rqankAYkmrmJmpz|lK,wهçTLکnQ-x8vciGPwpqs]]mxTs$9?_C_MH_O-ϳ54")o_nzXZ~Z}$E.XL%UheqZZoaFri:(UI F8 qR]N(:J5(X7;n|R9vCOpqÜ+!NI;Åh/KxG0VDv&1(8b,@RTu)F ˀlyH3k yO~G 4wlZ3cͮ{Bjt.'P>݇}uhE2DXTycmG\rёrjAykeFAe*OJ@e2 $Io!:($qɍHD@#-UQ-ՈS[ORd=˽*%-I!I aSy(6)&"pa$"2:|sx8e= CLJTNT J#c1rv#c9]CPBa#}7Y3ސ\3ie[P`iПL8bӘ !#uҡɄhrJAѾ"D@)ĢFKIebrÓǐ=\a+ Jx%f6 ec6 D@ˆ]݈Gð`b jCQ[Fmݡvnx@8DRRVF#ђs4IuIIy/ğ裉hTځ7h;!hE&D G(d FuԮ;bڨeg`<D,")Cz84(;"*' T;Wfe 8IT4ae-[C QZm&R`)**y4hI3AFbQr2V{,'| \d/)Vw e{L4 1)`R@pJN!BMV1pTw$3 &Tݶ .JH~F8>[j67Q{l*ǨS2,lL-Yt 'cv}G1=WnV'x(,DbIz* Cy| ibtVpQ4LjM]1,^Ƃ89G;D&4KBi9{FkZ|g LI*}L;dU n&9pPg=s7qՍ4bA u *CX!MF!*.IwWy "۾#>Vº @;´8,HM'x K2$sF*(Qe?ѴWZ' +!(6D%J!.rTrs RYA: x+<\T c'x+\*'w2)6gmκr}m{=__q/߼:_>ӛs뿽>BnN'cw)xN>?w&o/+18~AaPjFvs=^ve^q ap~oz.)m&٭{˦7^}`dn]r=?gἡ>^+T}ŋRw׊j{\ 95}tǿG,< _}B]67__wdr/Wl& h *U_[Ψ_ ~uf ~>joDZnoy9\AY;۽~۬n0E+p/^hۼx"~y'YC\%V_ʴZ'%(}o(ki4I㮧b|E@~ sorEC?{֑  'Kuw`ff1f_AW[L$%YË.%j|6΍}NWUU]],@M&1SWﺺ{ZW&=Ƃm|F 
]_]s/Ϯ1a,{pbJg_F_ok]߿zW:?~G^Bϕ]Y/Ʃ~c2xnz9zi7׫;<\5ٿ`W-Ig\LY%HXIajT1N3϶y_2jyW?1qsVS1 o(vEDa >,d1iV/iZj4;h%;\v'.uөT^U݁,dQ9C{h~<{'7ʫRb;mAv6Gh[ qȫj_n<"nlDCtrYlx#rdO0b)OHZ8HFS5(Q8FFzG%̆X5*9)ٚpNYW瓳H ,H)Ȩ )h%)lх*bcIAj{542 )WtJё*l<1e0dc I U1Ѱ>U6f%PbƤ x|H*tB*'_L- dўs;ձf}:6*dʂ!,HE*x]r&nб#ұivd@~еoV;: xw0h(졘u8ڕq䗫q?NƋ=wefaO'{^GKЋBT\WS^4O]-᪎yÎaƎL:ol  2v(^r֝qPrQlk_l|N\- mvD% 10(gE5#j6l+}nƒy>3݌8Ќmc|^؁bbjNyA~Q,!I[wr^ ƅ.C):4Zvc*JI2 xuOqqgkc+[Ec\$"!fS̩ IYD`tz 0گML+Jk6Af4LȠIB&v݃r.[ a(DŽw$Y(|}n\;}wu]ܛHW9qMgr:3ͱk& f7oXapqjrVu@9w?Ea٭ ?N_6`{6~/i5iu{ikyΆP# 0pC7owKze>Pa3X!~sw7k%|?yY[Q5F1Pryz˟G]_jVgG|b:-V˷T~baGf=\78~82ʖ*ooWldxNs9Ѥu6;,h`}%PyхP̹&yhm2aeWx .P\66H)Eg{m .Zd%c`1k)d|I)2.غh<6]}^_N |CN'$/zEQ<`)%v)8Ǻ*%h1XCh R OƢ@Asijwig' D0> gXw(7$tBFp^U{b0W$eI,\P U@Z =ڐUOֳfRώ=AGSmL)ђA.BnQ)$ nu9 4 Tc|/ vԞJMM| 12I*,ZCOA8?OzBzm / cV$.VZ),$7%RZ;(08ɑ8*~f2JQ7CxHܝҎ|ARa@$-%yoHF8zT)."cڵ'zv,LDvU>XV>pљ*u@ *- =6,DH Yhk0cюfva Ŭ.џKnykDh\%$ ɌJfԄNnS7D 1UZ."*o2)fyJ%ژ '`sz>ߏ.j%iU/TVLJf׈RF!! BzYt ^_}NbARV>`%݅{։BgUb\ޠ<+u>Gs|CGwG{{4ya<<ހjRdz{8yъZvO).Ѷe糶-W@l-P`b>.@YJHu Heb1.u,M(GѽKʔ h첄A!Q! .〩^(X/]5}|LmML8kQXTdց:]4PtBuFvg=gmW<⭻c[/lf=8,Hoٮ-VTjv}:۪w@y{(ZtOpv_Fk%[vY*~^^Ç9rVP)<ٮjdց)Ri]YAe:KLFAFΛ\M 86Xq^OnP݉{i4 F(fR.;}6ċ|I&%j+$EAʰkeW,Ao X3@ZH¥M񓍂~~f|yCn2ѿd~*%DM߻ A\? o W;vƼȘΛ P:{u!hR ~Sssgϑ 00aYBAPE,#tK2D,$Pƽ%2x8[vfD?"N"|_ YoV>/Nğ>lG『HinLɝGBpCaeשּׁZe2iG8:,8rzqҨeL5DiV{#cq,<O)*^J8eSpҁ08b8ƜVq zm]]1L}ôާZ[#,)@xp* CpcaDVa] Mw^C|i~icsrs1'w9=jAC@*Z<َ|*]50Hpso J!{ b7BQ]_lu(>764õ4謹Ey{Qмl&;}|ϟy*uEǯa )+L{g|dW¡gKYl J* 5wGx"/P1;,8;CPbK& )IP!h2[rИcQnKr9[S vn٥ͩ)HdH\$ cE[~5͆Cy>lV#؃Ә~9I݌1BE8#!B9E#Z.ʠ -)E-H AZրB1_){eXIF#ְpcbV٢Ļi_#Ű4O+~96d6qZKω[ӷ+29﷝66v?&hoȓKHijh"ᵕab9/5R\9jA$꾭eJJ%Qu-b*Z*6=9bDR-+}m1Vl-Ҁ/lZ[fy2*Z괒]ma:pp=n󋊌7TZ,Nᴇ;rggg+2 0St7]&vE2Ԟ_ 3F*bj3Tg'%t`?WU%iTP%"pa2N\cͤcW[m7X'&EZ #e kd4yKb |05!1C cwdΎmM޵q$2?&`,ܞ \bIE,;W=ç$R8({HgUuu Z4GQUdd\guzy(E0D>n+cD%<ݴHXN^8kňݣ*'TȆnCْX 8)fM\ \䌏> $472aI 0ҙb}:ug -'8̵WήVrQ X.^.rqcD'r J#Ü&kp+92h ^.ޅ\yuV.Ἶ;aFW529M=Zk8nEx\@ޏ>O5 fN;g5uMP(B60z-t {~L}c&宙u>S~=G;t`fn[k܂oVthbVQ@0`‹Lq&BN&F稽ՁiB.%. 
MH6wKI:wLBYˁ-#dBȆ'_ @0cT)y@̒I-tU(QbK2Mt*uqF*O'wW|u>燯^|]B H>z3ݔx j\ ;Ih'V$o7m$i{bV*ytcOviUc?SN~5ݼj>$ld9`J*Pۦy*!}YiD3A :,v/ߦmO$u4L iijs^F%P_`Hb0HIA [[vh`0n]r0af2; n>CٽZy*å&,^"hg*Q7̭p<>.|Zu"DoD[ MXkjY_`\#\R@i6Qr:K/xT6=\R/x2|͖j̉BOFerxU-g_,4YQ&DVi,AJI?̄d*x,y\cKݽce v=/0%M2xi=]i%k<+g$jM-ur4Ig{s+q, \rw׸S]tqW^TEԋrgGFcMPjFя?5Qɦ'NBC$F霰wS54@,T[nj6Bi1)&x-, §kw+& ^#ƺQ&uW3G{; HY3$V{ykWoǒJ$j!RU$t]>3M*8&M4gr=ⱓJ(=}i22h^ͫAL;7ey\Ԏ[LG9VQf Hsd\aMn[Y+=!#>抐du| EZ%H2ZP2JH A]H,IvAZG(%+M1e:C*CS ad:p RƁs\]]ySOh?B';]s&C{>o.AtyJrķ T :iӣk6*8ɐrDL¯}QgáUI){ yV@;&}%Fih`+1 ) Fsic8큆+.cޔѼjj1|43o6U v4+9MkK;mڬNߧFGqoi+Fq,JA8Y[c !\sI9fA(n/3^$Y"m0ɐ22h(xg\#e%nCDqS2xY;\zp mUwpɍ]Ç؞cW`KpA! `Gk*݌қMt^暨2G~l6RɚJ"$o_T)1L[ԼTr3=뛽|64UjV~Mų;\9iΆ?+IRz}2§w~[n Q:͉caؑw:X1XL\Mz kirP@~e|tV{#!h4XB<JDOK*̐MR:d"pΌ Fuڣn=n7A#19vJ^v/ K5+ ,+഑7tꩳlƙd>#Nܑh95Xpɕh/Gf); f1ī[tR CE,ٌ&ʄqAJ,Ơs;ܚPQ|}SʦY1.& #lyLV 1/JJ*||XvvЖç0AXzNɠ>n;Ygْ/B{sDU9 ڊP!(= TDrg,H Eճ@,iwBN?t%cf_"O{Q8 b$ Ni-#hAk0)2&\= ja'z=,% 0.$aYYs9VTt(&0FlNXG'cXuF_I`oeI%acX*X&[YEɹuN.&P6qt(c깭;?LOh/ܖT7KMp b[WNhiOf¡!XZ $٣;zwF!1woz6)kpnbTnþ ('3䁜Mx=w6Sw&~&!QnpO?[a 5;JwGxl ZoM;A/"5R]šӢ%˶{&vYn&z1vB[G Siנz>jU[`&gBVmJݵI+|@$h<]\UIQQً[‹IEX}u''oB*w!w#Be&L/tHg\_2Z#ӄIz!&霹,#e\+i-y1TnRW9?n0z;wwwDm}n>)5,Ӷ{@Pqw$MNO"RAKÉlA ՖYlѱt|(HV}XUve jQ2D.` 䢥 ɽN4s"w,KpAu3q,0 >fӏM c< W >>{;:fsJ7%;b{"_[4B(Tʓ601R"~'xiV X #“ BK÷ ђ ]RZv<>;G`[d 1Zam ih)7dRBW$)Ya WlwA M=a@8[2)p;gf%3[њ:*vptKTkAM.:3Ol~ <Ԉ(R,Bh0eg!*Q*Xށ $yKRHxCO'Ґ{J ?\"b62E2%ˠ3&ꑧ Tɍ[=}όoCZi\~ `rE"Q*D޲8ҬB0$ &ȠNfV`}/:mG P}B=F~4_jՀWn ,al!ׯ(oSyk ҼJϧ"ANj0b~ %2VF !( z! ȦQ\[@Wm履ot뺺Eq,qìk)z>=;UWw٩Gc G>w2G'ZcuhTaq}ǣI|ˀ'_vQ>v$2Ӵ!JΥ@ÓL"`4FN #x e vgdHM*7ԯ+#2 yS@Ϡr}:تOia~9?;6эTceZ_(옘9s'amƩS^z{fJ_ot#]+lt"r;O,Rj>u Z_I g󥚸__gW_aۭBp_fnϲQwH`THjv!tr.d2! 
S)ɲwB bga34zw7<]/Fe^Zuo-K4XBBt9׃^Ыp 5ӍGVI2 r!/{:K7z_u.Z)8rIU`S.kKҒrI,/b%KGBKWip0qA/KgJ-$j<5_Nb"dkym å|PkYUVbȫRCu'OeX;*_٧[zEަw<*UqA'Wf=*' zrZitN^F8Z8:pUIrzuWjSA$ (^_8c:=f-`*W_0uN=ʳVy #<:/F+K[5:hl␝\ږhO|\yqm.ܣJ7QkS+w#<%FZ Rfg=LKѢTcp6Ftl sL']u] !H)27@VK?e 6@ފ{t{a& y؜yY~ :CSWưGkMz%ǻ|ŢLGw׵/onWmsRJPmUJ 1z 1$R%2g h :]{mf{vv\D\ $aK:d$+dDEPUpzμ(^[`Y:A59%)ea0FUj眭C Y O5gK;[է`f{pw(jliȔBQxTT9'ǞdhJlT& jqKC>8A7on_8,)JB S>H$ eL- `}+eƁ̄ZͨV<~R']~_ڀ+5YD)zה[iIV$\JC%%58zt3xhz) xVҖ|ARi@IZJ#n;n _om;lgyXqBc46u-̣dIB _ߨ *g;E H%AԀH Ye{hgWdlgb4ցCF?>^1#n2\4txI<=I<DX$ ;J؁L`g`q0`B:pF4͏2ev?E;I7 Cdd2箱d∿nJ%o 7UZWH͍_t~ׯ_5td+z򓿳COSRubE$E \AD(ƔQ(6RpDoD Bx y|;FV̮]Zplt~޳qO­~ A _Fz6_ӕ~(YD8L:lxu00IJՠmAmֹ" 3gƤ)+-YK!*KIwѦ4/sZX6IkH*`ItN=  gK2%)Ё0٧O*zsJƒD0;d]v*:Mtn֐QN dց6]4Ptr@BDzpGS^X2<{ҵ<] *->^{/j6-v3}&To'x6S9{׀%  9Jˡw1%$z24 !]BG !le_༗DYmNvDĬm"@\aBQ ɤ%\deHp? J)$`R¡d uiLC2WO#xl}1!6M4K8.^bLShq c*%$1=_}&h7ji2_UeFU5{DU^PՔ]F#}YHPHYz$PAjP%FT\b)UHQIc-LT,lA?2֥ufOUzHr+w3 <_>{ Z&GO1bo7ȷoٕlns''(ܿsQ //="4uVf &96G{%0I0TMB1,C)ym(`:Aʂ).$&:YTʃޱ$. y_5&Ĺ\/h|ͦ/ 8^xnsdzuφގRc ꎿom4!}O|-F (!I="< "T]޿d=0|;-%aSP!#oqEː0VSS`ƘcIZ"IZ Vhb iMrpqHG?ӑ^̙-ANOjw;TkW_pǠ&\l6JyGPEEh F",B!J;;oA [xtv~z鄞y^PB6"{BC0gKR TӮGHdtD=4AJ}W?q뢇/୾uH+OT.h]$J[6G@FdrIV?ћ=1^6-hA*O֞(/ 7! ^ x^}a@T$u"Nde?̀`QJm`;҇_}¬4-N=O| DPtvYC_$~ɵ8B F,ʃ984^(1_/:[t<&[~sR3n]W(2u˴ӳ[u~zygz8_eL" :X#C@#%x+j?,QI[SS^Z6(MmQ^9ulh[GG׷򥜯8}`a֪h3daM"43a_uꍣ8:qdH(/]*:snd0:A U! *@h8)g뫊thر*uE3\9??Kx|6XSa& d)Q,Aޑ/ Qhmp#?#v5аۻ8he,YK(g}cBٕt:r?ήZiٵPiή_bgW- ` ɉ+oH^@h5RبMCM C&L}1U7eF[V\)u=yYY}]װgܵ.775Y]hͶGJ{m^@݄6VdQ j WF͙3А.&ˢm6Q1z%-ȡ7YI;0(nsQa͆CWo&LWzIzD^xj`xMGMYGV(ǵFDbb3AXlJ^1ԭLx r!X҃S2:_*G࢔MBQ*  friԳB k^rtSf{S,=xdD " 0jỶѐ.SļEh+/:}02Yʌ=a͢ 2(넝8+ Y-I4R lqϕ C^,-ͱu60a5Lx^$mI"FN=Z T=>? 
ښv$NY(FŽO>s-Ll0M Yrn.1TW)L2H[;n" *Uzo#,$W!kã`ֻD4.>2l~'!׳lϮgp}Gg_o.˟ӏ w߿|w=K$ PP&m, 0i5?{}%?W1#|[m8|'y(UvI-տ#US^>,maޝ򕿞<O\1퍋;(G{[36xߝ2w it>pcriK)Ŕۻ,[%{r 4~Z"]K yi.]8˯Sܛoe_Ϯ8,Vs7+qBk͓} \g=;`vaW>!rzqz_M߆Kop8{dNW7X|mhӂ{^_пXpab:_\̿Z-C.cIY4[}VI㶧bc~`7%J~Q]-[nT .x5ٷe= vm7n"bI'jkUJg5;AlR{ppůx22 i;jϙǃ^j?F9Q~,O #ĭX1yfRAfQ+GYi/Q%цSPT^1DzۣEpCFzMT>fǂa.6C`cFROX>xumqԟ,U9YTj<_yp"eOz̪ϜvUX8fɢ|H&X/'C,'DRt=Ȝ2Z$&+}nU, w d^s\ ^Q:Ǟ.?و9ɀϜ^S:/T6Kt1qH%TZx#*$F4Hb ifYЖ@rⓊ":aK6N܆*^~פroj0ї.cHj{#2⇨ c+[mR8&0i1`{됌J]oC^!_ĝ+p3whcݥK ⋲U7 F8'XlbaO ǼR~ &W`&ˀnfm|,ʬ@;(e\;)sN*BX 悐uY)YXZٔ:&l Kw"F"0 k,vjl7k.U`fZ o'oڈu=ߞ\U㑺H{HL|>}~(H5TMi H!'7ssQg{gCgY,@8iAIt(!Gm%6BKuld^r]7Z"DpE=b˘ REf)y$BT/(Le 9;:Ԧ30H>M8Mvsoڵ}<{kkʇ-]ayMFBVdb얩]xw9驭}Hxp]lB16O]f}r֛4ѓtq[&Rb:n~ idC#[wv}2퓷Uvyaf~ ]~X.hȖ ]U#z3V6F$'D.z&NAíDKd+KcjWFv[WI/g)|uⒽ`^^JA(Cǥ135jK7*D%)J L @zz+wyM}8*b8}7яG~IIVST D'8Q C3MHA[6D+鳝''}5cjI. d. U IqwmI_!e;V?e 8ds^`Yí!SRL2IɏW=|T4E5%Z-NsW`4 PE q&hKL"ُtL-( sxpHNȒjlD2Λ*uIe,vfPE;!x61F&zcW9GS6}2KɥU2$RWsn5mLqpgGYmٺ A!(${`3*8EqqXW3Dz+-FF P  !bt(-,0B)OgْŸyX[[6*]>b P匁94Pʑ6JD&g31>p Y& \ A~ 8f0q+1TBz3eb-2h E4ƂDZՠV {;؃d$.$,ȬĂ(Ev*5qMU4.'Ҟ1LVN۵G:oIL0mҠ K ,mVQr),%n`c깭Gh$T>j[dWSO-o=bimGWNhiOfYCs,jĔ0={c;lj;dW'ܰ;4U_H8e@BlD&(4ȽgdDS|g4w͂TM·B` FS|"$ MV8PS}孇4._j=fQJ >NV.ëR6{[%-o,κk/RV,Jyr^J D g_ d5Q![Qynb[ %@]@=\p*PjvWb[Ӡhjzi f4h$cj!`4ɭ^]K?_ލGjQlBُ8fC2?Ѕ5r 6/ uhR=zHeFHtLôSZEvc6'>TNP4Cn 0lHKM;>Lչ^tUЦ UN-ӄ$=Fx:g hJ"Z#sqqV{TNfY,:,כ뇂Ӷ={"5P$OΦvmk|GodsJ{P7ΆCܞJӛu.7Pn9^g`L%Ju&"hd,XpYkd'0r[:y>rUNmBRE>}`V;p`)jk:Eݧ |Hv^JWh:%bxYQ*xp6Rt 2#5R jߖo{mycI+FXИÀ0Ko rx6KTQQZu,rAKe1PA%~!-]к9wCU͆~&y v#iI<6Μ}eG(hE \nb>TQK\׮ST&YEGM6 6W4V;s}ԃol_T>@|?ed<~Or97*g^w^ ~<->wvDhQ%(U7Z`}y*&A0Ƅ'P HmA1#ٝxF h=a/ë; oG0;mLa BE";O+abU dC"!1#"B+'<7/PW-}Yw:զ,唐|8ݢ.ܮ#qm wSDWnye.e73< 3ZkC ¤J%@e }!BxCXA$͛猃4- Gمxp`tx%(h|pFq8X̾ޓ /GA,6躷[Rӟ__fNRƒ7ypZ]*pS_ِH+JƦ>ȓdm20T0<#JFyîn|&z^3c7970o8sNPs2^iݵfA|'4 ہRsWu2<~fXwe؁.ֺPK(noܷTW{4 r/n._Ywts} 5ſZ4-;&fJPt$,̷0[(Zsk; 
MrO.vͺgJc6ZV혠ŨMW71q;]ܱ|ضzk[4loqSUf,"jxzǮޏ-Ui>;9]GPw{1IbF.Jc_rHIڀ6ȣ1k't2U/So@v#-d$B0{awr>debzn$iȝ>jmzU^ف!4PرkP!q˪8Tҍ}g?Ub}u\&xW|F_SObY⸨]ߑ /C{Q|?雋LSQޏ'F{'1(bF58Qٞw2aTrʐO3⋖2$Q֞&Y7Y%bmW]m"*}POOt&] ̼;Wt瓼.aUeRPlXHz5X._-ښ,jyu0/Kߙ]LFӹ}Eσb J1k;z981T/@JR DӳGKx=/ʅZdiB !7ٲ%-ü L ۴FC;ݰrn}DA)웵&G IfY)5<^)on!2D7ϪFZ}[񽾆{s@mac=ްuQT0NG뚢݀ZV=z"eK{jBH?l·ndcj 0IqfuhH24N؀dz,mL@Xcgl/ v݃6RB^@9ufcZ]I(@z^dRt}YEY+~ٛ.vLJsgB%J.刌xTHJ\dA:Jt6vrs5rl07[uy+KvKo>jS~%n%pfP;x0i1lx")B-QOl/'쭸ǰꮙmOux0v2,GLFF"#SY#*s޻%>He 앢0 MYK 묗R%MPԆړ PV#gKr%M(DG3DZ{1 ]BJEOwܾ\J˲|wAJgumF#Sѩd MppQ<MWgFK\s}wbz^V5 a_ny$#,թ0$ rci-"pn##FHZVAw_~l7mܓ4JL,M.S36?`FɰQn {=e$ ~P謄J@h S(T6Ik k1bJFS%sΎh+l|.#R&0 OY$g@=Jɓ1%$9GpYiuPj6DAM-/2 vil4<gi2os1p2 Dд_, /4> |.-rs\`깓ႄ)ɶ49mId#J}cKa|O/f~X7=P^橛x,z<,~G۴ڛwJG;?[1zl8{Ey17?X촗_J'zu4{7?{FŸv-H.{Eq}F?-hRKJ {gI|)QhiT?UřQnNVͭ7wϫ5hz{߾Cc{ٗ>U'y'դ yAyu?̙7=zJ9aӀUL? G_72qq:-6b7+^|D?֎f|g|یnmnoOOuӜ~S/-mOOg߭2-!Hq , e`))i4K(R(Ϋ>.8>̽I^!?  F]`j>a!E`+[h+[FVkjuurgl\hޖ,7ѰU0I9G#lܿ70^pWY>vdxƚZ%$!IJ `;EKĪyiPq+S+>/NDI'V1 CQ2&qwD\SN4yqD{]VQ YI *A3ntI?fՆŃ`&E>OA ' % 'AZe2(|@RNtDs)h@WBTj˔΍)D'Ldg1G|ǢE ?t R9׻"H)G7#!x["5UiA G d}I94U9I'Z%k1$(ˎTWLXLЈ0 FR '" Zo\'N;.qVuGuSwMo5.7u'2R֛_\Ieq3P0e7Z[*gH?Q'ywojP{v)a[rOTWf!؉>O$pD7uWLȨDԲʐGZ嘟[MP݉9], }wrHGsud>9,Ico69O%)Lja z˩mR$vLq`.x#DB0*CtAa'X- j 6||S{0\ mm(,5e'U[\-:U]_5Reb7>DnO]\}⬭D%m2籶.V6 Bν e":u=ܺjZs8͆}~6e-vnۺ[r~Fk>]oy=?W7tl$X,sΨg[s:imV@lKB>jFB!~y05O>LjwQ?\^|Os@w-|\5z9dmj<#5WTE\Z^K/嫉| 早w> dESK"6n+OTyQic(3wejn.X D:00&B"Nsdd K %ȋ%s#Ÿq`+lln[|s4mhP^w F$Dܿ(0E {S4p;/ee@$52i|=bE4[mMr\Ҟ"T d-eUTz|i+ٳее\h,_D0{bCA}Shޓp- (V&iUfeJC=fdRJEM&Qn"72\G!ٌn4 *池vݱ/j¨:$ص6HH ,uxHҩ`lZOtQ\pRY4ꎨ!dhϢB xIQiǘ BxTk>^Cx4K}QFD!bkX {ʖc19gIA5"E >)BEQD4ng`D Ӕq52JP,jSΞX.#g3"~9qq2 ;*[g].SB pŵ>-EgJ50.&:%/(Pɭ4R=*.}JᬿV0Q:ǧ~4HßQ0y6a\NKpVcTJڅ0`#6v#X'5ij'j)nT#Zv+վKX\Z LzՏv+WmY d-I6#Dp`@,7]6M&ϱY?O:v8EϿPR>#F5Q3 Lgj=vTrwTh3+$il ?2sp\}p{^z)ڿ.zWpכW_G)O޼<~㾟 Ÿ>⻓Ql=h#praN4t0Թ ʮ {ʕ?R{1OlwOӀV(;*mjwn\fUڿ հ9yk&fفIhշE75BͽcdCn {&mx`ĉ+&my{o8 ۄ혷1&IӁY:ܾbST,g"JX:Ǿ֏vGiꯪ7tQwي!$ tMĀȴ78 mzc{7k{I$MRv U S= XfOuwUu!( NKPm{Gu\O5ϿsMUڱ0?dV,ndsc]ZS؈h(W 
K⇼c)_n<ʅGϫq񔢔ҍgOlecN%=xp(IݳԻܥb .C?֐~Lၱkt9c6([8S@׏{lEk{#,@/Eݚ]C\幭. Qkm$CIZD(n9X\ҵڭRJ'5~ՠ1R ؀" >gjFF{Eaᑖl;R2iUAS<"zh\CH;jRX|FDh݅ҚU1oΓգ.U﷏^E1לeup٨x0!zF U1!AjTdS๶gu{5,,ZIn66IX4Lﱙo~Ja;p5V2,v /d@hCf38CB;9/U“g) ?6;³yZ - &ϾҎm~Lmktڰ1ư&w1{eY5LR6eyM*}9\xΌ善c )Ht'9lJnF:ke-e` 0GM,F47V"FMsFV;|x<r ڽg\n)ىnrDU'`׼T·U`5F-smFrEp)$m<=vo^pjj+_XR^-fSI\9:Lj8 %\腵^W>Q D Oc*Ja*,<HA'g5{kgMaM(:js%HA!j{^<V^^r+&UӣS*K.8{tSyF(AGkĽAIx>f!*0,61 ZSoC\Aspp>AHwp"DǤ5:Pfi%1@!Ayj?QPz, ChU٪C:ɝpĊsE#Z Lj, Do(.` +U4}iu5CHn )#1. !@0sGD|L-ƈh0ʠ%M:D3E\aaA$F#c< |TV3Vm W]KvpY2+}v̮[]wB7o.ϯp՛KL_\wRH6E6) ?8pȋz Qņ톏Į/;h$Iwl)SU}b\.:hW RU~.M)ƥT^2$\|9Bwo{l*o9 -'5ww9j )ԭ5&|xs;|篓?')3 z(//Ϯ{нvLZwh0h s람+Qd޾HgX9ArlVB ~t|;vYȭ8LdPvh˳磳|[5~U /gEӝZ_(lχ,.cY[ܚ {saTq΁X>83[|ͽI1H,h%mݱ }pZYMW#(55%Σ{"H`@JuąBRPĝe+IuʱFS۷cdqf#S+vAu ˴ k]\FȍJ,A<ƳTg騺Ԇv5wvƿtq8SNaDsĪ&x C cnF l5]TT`3p@AT$k‰j Ma8Ѱm7NuId@$i&J%H2{EVE}&Fi$*95O9$AiㅑJ vCFDZH1WOpB_mU}{iƍ1`1F*bG;-U350N46TzG"wHl:mR @q&eUD1*GQ25 D4vFv}B1]k0ѷNcSNQ!>X00]㴍Da"KDFfȆƎROH~o~Ӛƿ9 8ԘdLmt&ޞuL[NbdNE%.g{_c;T@ASRJ0x\1 >mXPcY,vTbԆkWDT}ٓ~|c :+8ƄxJ)ZЅ Ej N{F &+b _*:Oo@J 7,Dzڣ˓۲~~K7;6GXVa3X@B=a$ VGI zM%<+GW A|N 7{}_@]k*K)|\T%ceN~TAÕVL9QQkE=|oͿcL vY%G՝fvmlf!'g'sqRUpo$FՂ>x"USRkN,P%h1:_-z:iv*fݽ0W'Oo,b4*Ocɛ(?Rj)_U>*wdUY0{N߄z#Ij"h"լKB[`}"4Lp+UGȃ#y3wȝ豊wq eGˎwYЎѠurIQh- D 1R)ELG*eZ30{-#]* i8Q8kFNTl]yK Bu)"B:_V["@vopj vM<_ca@bdSWS;bNص[Z=[ȥ&a$y$}i;0zs]?˧lVL:]ń02yOl{gf ^=y^c汑n/otyh| |5)bJuۡirjK9X`A_ ]>9fvT5kOt[c6>Xp<։&Se{ )},ZIw(Mo{ *Z)Ѹ+JL}>oN FymE3xƹsH*F7ycjp^kkjdGݍ(%%eو}3;AW9;b|q1 ] U*_JL A\jL(!u@QdpGP=_O!/I#1BE8#!B9E=,2BzJ%CmKf e ()LYK٫(cJ6XZƊL5ϖGWȳQqǐ$<90Ii$x=?Z*EZ #e kk~ÞWA( t./ !c'ncwdΎ&eb1Z&W>bTg[o7i<\:5lrƣ)|j~q@]lebUd&ZA$sF'Ed &'iR}l@K**9PLH!"dt:c.r%*M4"\d~Ձqq9WyTkfA$Dc\\|ΆW:(cQ`)Hd(- X$0AdrcfP<-g+@XbS~,nI}TiȺ¦FEcIl:HwXv8]9!?yDPg?yh-#tⲱg@J(:m@M]p"+90KJ?Z%0:SX7"H)2.ӒCGc1U}sM}g/vb Q='sT-Q|VHPsRR9CbL%2t6.VDkq6QϧW9O1;v!XȈ΋yQvJ+K>gPߝP)α06ւs Y8{R~|5T |d壩mL)ђunZgT 2 @T \5P<#eP.#5̂Ek1H xLՌjx? 
x'=4YXidXHZoJe-RPEmhzucZy,1_aɲǑ{(eM-e!,:d&/[dfSX&2AzdVXn z6~Oj4Z=x !TK]\ɂUw6?%wZWz7\N7Ue](ϳlg^_¦͚t3n0|Kn'ZÌ.Y^tTDjqjc> Zڨ÷;4D0ԝK{ހS鲄A)DTpy(I8U>ق39j7Z9'u)N" 5.iY/Y30jw~t;Wtf^]?~l,o ?Pwz^{]|1~|7wGҀ%3.t 9cO(-;1%$2Jݑ3Gw64wD9]NI@Eڬ]""+#`pRYQR\+JkTAf4u3"&y <@[2Jdui렕8Oq VrY-σˏ4yE{kN`"X- Ƿw&1_}/N*ԩoUKQʾK}g)OX hp4|6JgdIr  ; VFEP}(~Sl`g!"AYS$ R kRF˶kv|N\qD-P, ޢBpW9.uV<"r9Ŝ^O-]j}K!cQGKFIG ml6M<{"'Q:%.3m J`ZM) a&bYR4z+۞Wt>Y #(@.ZbHLur**{dI0 z_5&ٳ,]ͦ߾/'xzry>^-6ʺ19['C_M?Jo?号KBBVTtF^/kYJZ x' 0 « BǗ&q#Ť0y `H$ $p5/C&pBGtP)H2)AHRrN+ʩ5^4u鿦cufR`7wz^ծ93xGy'Xbxn}G5rT֊##*r :9C -Ė -?pHl?;#LJ!4X0ctR!Il6%\l Jow ǤH~E?ZJYg7 O R  n;+켞Y|vVԥu2")FoSZ0ocl0(HG6a: )g%DZ=I /̆7ja*]{)eɚ) ;_Á5`ξ sr/pN#V<4^E(-a H'q-O7(~_۟@_߫SOy=4}-ukm\ƹvF8^ /Ol̯?C712~9秗M`ղQ8'F2}}®v?|3k#9➞5ӿNӖk9Wu-E}3yA;1Xb,ϝ42 _Cjz :nf:@>zLP_j(ᕹwRGG&݌jm]| ) Dz+XqMXϮc|o=3CPʬt~]wX٦P0=7n ,B=-*$Iy<Z *g[hz-k475hF]ӻ{dzqحp ^_kGhQR_}Ӄv@_W*xF}Ӡs-|ǫ_FJ(>xfE?{qt_'9|˴`_mDíO&'FW^]7|edT+OljYnF!?L"*.U1'¶l( DN "_řgCv~]~rI:}uB9-I/7r9CQs! ZQT-"Ș*ɾ[A/%7Nd97YgLM,>\ͷ*-8߅o}5.\]0o[9d-b 14ո] 2<9*[Fy'YYfѳ9vj0xz=(1fi|5&(oc8ϋ["[lT ֡O#JWA&npn^ܨ9 >OwNo^g^;Q=7{ M>"-Q%Ƈ|2VNn.da2!/Q0*կ?&޳q#WIdw:|z|ڲeIƯ=b4-GCy48&d=YŪpCP?[vhw{gFns|1oyc? ‘HX~-E5CZ+ua{ҴHZQk+}q|!@l!:JUyYjLQRW:}T׎}x])Fdwd%\~A'\T Vŵrև(S`;KIOkfkŰjהYV?HHp d1CF" 1;ϴQ*pcSqb9kq }5b~@i.mn9Q0A+n8 ?\ߏ X^!J-+A##{H"'K@>ޕ xwVl?Vl <BYZ`R<$ǁi&b{B JIMR ,T2Q-@aǘ{ʹ5??;$3XB٪0WEz"{ 2̥krGύ{]᜖ж3֮!!qvPp?X8YԳ%y0A5e5%=ȃ8#%D|RcΛN`nV]m&wHAC5*řLQTJxb*'̡QtjHsnH'DBux^/.)vq:(c\h_YS/=.k^Ҽ7romKh{5=2{_|[H.ߕԌ ΉiԔ AqYvK%R}"^ Y.\ހX悷H=Y6t Fͻ}VqGi ʓk9~˵/JPQ~wl6.GODuJ)Fuq+U/~[v֚h4F҂T9D,(@ܒVpXgyJ9ʛU(f# P%t"Дjh2SjwQwSz ũX]kkK&ztq9HXq‘s3&S`{ɠ-if9L%^2wͫi̎y8__\ų헋g? 
.ŋ_ H >(8G1ޢ7)׏[r0mN͛8?Ga7[.Z%=?Y.p~~v~RGZpql/=Z_WF)" csQN38^yc6ʫ&E_F,>zOM8 QJ5Fo֠Rum׎[Z:d=lyk}1.#?^^n^G.Pbqx콼~Tޞ7%* CE![VV]ӐH,MJߚ ݀V.aܵ-+%]KyG^gs/re)icu0FcgοM능pmh _#VU5e=eko4q~ ej7˵~5I>4QOHsqWOuS޳T`\rZiF(F QxR=1yaHD iksf d}#Qc<82b䊛ȽSNXEAcx JBN(o MjJQS8yxH2OQHfsQ<.04!z~ ud$Q>+5@ |sXH0dX 6@uF8PMWi,J.YD6Jb,'#Ekq Ր c AK';"9>I Z{;*ZӢ0佸AԈUγ2'V7>3[17QFY፪%rֱ$/-S۔]:rnux뙕DMDxV"'%#C%pl.D@1JEem=7$/ HJQ>1@Ǵ҄ҥ7[ZBG3=uS:j9ܮ+.tTS\7.[;52\O<>wJH@in zm}̠ZCsUbUP-E0aeadz(癁$Pdc;ɪD6etNwrE㿊/۬;&/tQ"QbFBQ%r`JQT oϏXbI2#1y&.ԙW0T=u)*M6۴i~Q [:3U}wM^V;]o2}2NiOo|^RpF]ߐۡ'AxK uBt颌'LBDžC{{/i8k.+i~'B9х87 HB}RBBɌ>]{ >T<,“Y^B%<y8ܦ@!)0W8+}#' ]] 8R/x}rRsxRhK |C]5=vNe/:Fo3THY"!(GPYíNiT@bt1@'_f◇S tkB:2;ø>s`Lz&w=_}m\ٻ@nm:bd^muL O9gsu=ܺjZos8͎}}5e-vv^gJZnLJloov9wyԶ]ҸOwt<;nzUjΧ?oJH{ئZm-cעXC"w^qlB4KhBG@oP?K*kNlwdBV A ;CD8v8wKy]{ TORAhDHwH2 = IXݙW`Z$b\@PxІ;Ŭ`QEXMa۽8mOF_~;j(Ɠק]OfݴٶZU}K4mo:C)vw$B蒢Q$x@DS(lDg08;0(ek뤀$O!JO7( iJ9ƣqNM'gpig}4邰*QV|ah6y,L"ZRY2T9b"ێJpB0 :2PH+g%(6B HG5$i-GH4V"|aB-nB'u TKd=Ó=z' \@C[U~\Ck I[B$+4aS>(p9C0 (mYΨ@v)˙=x:$ZRVǠR)lB&U3C>bJ1^X 3兪0/T'^xP^'x1kg 5/;6?{W؍d0Q",V2AnovvPmkXV-JHV:[ź<<,ӓ˫u cAP)o|BȄPQF$"6(>q>tqXReĹv RUVHD햵^Z7EG%ӈF/=v23j Tsx!,\٘B!TM=x|㿖[lτJ5 EXS)"h,?\ n~&<8O*g_6qo:Υ"b. SN7zvL}70jfs]Kc ˆgG >p&\"k>҉eU4"~zװxySW{siNh33.hda.[$>D{/6߼LE!A <}ٴcW<9_f,>_UcIpW~QI_ 敏zj}#'ֆ#'%]Gt].=3̞1֗c (_rmi{ %8ŋt>Vv{^d_i_ilyS;eRMKjLR3&$ɺ]%5 -RQyֈH"Ր=#tinq<7Uk6V"ɞ3 Sll ݧ2Ka'RG6Z56~8|S|V7rZFlHs XZʎli2P hYM߃ςvVis,Tx]M xRQJMgI&۱#w Y,Xg8skyꈱD L(vـ=&1ܜIfVvW`f+E=?–Jxw-&ԖX2P m,JIk?!NF?&>|z ~e6Ҟ E2b46*:ֲPa,#DN@Tk6UNq1N5=zu0SiVMq=P ,{K5LbmڛJqeaf҉a_Nf[툴B#&Sl։}2Y5v_9-y8Xl/~H9:3eU"Et;q{ջdzXÛ\R;RO/I>giyli8)fSr1`jTPrioM &VW CUIj)9sZOg&:2辞=h۞J͝o\\qK@ojn^/=|VH%_Jxש;U2|zt6CGѮ'aed8kgm`H.O Z4YF{a˷E A{>s|т)oHRՖg @R8. 
g .=+qw˟N>|5I_պ+QѸW|o<\6bt}V%~T,-h*ϱ[Zʺ<%Y:򒓆^e~ᕝ}Ym6^q܆JLJ0R iFVbBJp9JoS/I z.6KuO*:C9N klR UV 6;yFVC;( w'5+CO{2g {r-s]msz ܼ~kkswH\.pů?֦ex?!#^lA|P(GK֨d D>ed6_٣SFd/z" ǔHuP61MZ'L)4'U9Cs!LsٓJqI1S\ɏ4t4WNk.&)܇l%yk\ާ;^pcl9lFwlRJ~N;ĵt(p5ewRn'ہ+D:!N;čî ~V9/!%#oHAmA`p epiy{e )}<շWU SEm*oohfl1h2qP6Ag2{ӕ*zdAl6]JEcߝwNWF6n/q `yrX҇TN~Yu1L8>TF=~Ol{yEk]e\z43Tfr_^uBIܣ9UN'aZ&p?usKğWS>/(pC61r_ҙ%˱ǒ>F!&Vߥ[oK  u~<9q9cz-,dM*!{J;&0U\;?,jsL9S5 a )Lz%\mv +qSC* ֎ddT VMBŚ[%"Pawɦn2I&ϭq;cB줴VZKACm41G8:ׂPbnd1J.atS*r9(Ք.}ٺ ݛAd.I&fLWMB=ն.=%B ƌe6cv_r FąiQ{) un\"<efeYNmubT 4[cF`8giAlv,ܱT@fO !\V]},%9bևz " B@'fFA}]}8MS![hDK XkgC̙SBqsRVUHL˝{ΙWRUN!RNP_ S4ΉEwnД(5)6% sm-,%,H/&MI-s kB.@5jkcdh8f| qRv:[7'=Q 7_,TI 搭&4#IzH䀘Yh N k@]0,.;h{jKm.5SV Lc& eƻ`4mCnsPR),Ko1 xW  @$p-ipO.68ɬjD9qnՃAW}ȡ"@cqtq il0e 2yv=`k.[heb A!F32i`b)ͫް֞Ǣ*X-Ѐ2*Ϥ5..?SZŨ\zL (j9'd'5RĆ $cq$ „6Dա\tNa8AaEeiR >2UoU{C[3#LA@X'@4DbԫC fE,RLs̾XS(X͈[s6   c6Ksa 7RF!B4p v9r % ia=*A:</ %`U؛k,S1F )M  <̲BKw/u4~ =匀T*db';7YC9<(,!s ;G%x}B ؿ\Iv3TcǬ99p1&@Tambʠv0'"Bu>&`?,˫vxy sP /]"@-Xx :p4,}^wcW2EP{LV*4 <".'ȴ^|"ЋA9ћI >2<uq̖[ O²`Ł$Ȓӗme`-!ۀ8JßQdAX?5 y)tUGVPۆhˆZa(ThVAO]r}X.Of Ş0l3ol&Јh߁]~z b8AbC8*#E./;0P1 laҬ^gۢ"Zή$cwޕ6rdɿB4K`adžFUe[c5)T3xE*EYb0&눼"efdZK^: Bk[M` N5XXV11 Р tdj1v7EoPuvJ36RJC B-bw7qV%p9)yXD*TNc D=kc}8ƤMk|ӥ4+it9̤ij5$Y[Eh*5)NZzK"(/wU,e4vfH$tY#>hC^$41dSN6k6(܃G׸ i/ilל(2-m4 Bzdqf;t$LCBt^0{" fj?88PZE7RGmF֭:DFp^KtY]ѐ۩Fl1&Ұ-w*3v{iD/٢=TAA@jPFA/mAz |"2PʝvQ0l]k rV+ &eYǘ:i'g @1]NyoQU@ #nhB) Z i(Cð{=ڇgcP` d.*GDF4MEAͩ X{uY E^ f)R@ɷALs&(fD ZUm s@׺ܜNK ]TPkxZI1wIwdmiÛ  wRNàeB [EW#W֊D?@S6`1GNcO\ (D IYPFn25E!fކ`NlմOfg "rhcCGg 5IIfBw) R[zipoiLQ~ ͮhw& *AA(ZE _NZf6[ nUSMeT!S[M}@pի(eЙ.UAPǬxӲޅ_M$>\`n0F0lIB/gMp?&8;t wem! d@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; Nu!0|Vw'JWh(b'!:e`'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N vp@. 
h '[ (=; |dA;;6/^_vjjjivo;jfjexΫg| #!~Rm+V=KppPEq Y=1Ă *T9tEpu1tEhG M_`nՂ^[<]UVΞLLS*ג['V}-?\Kr VRᶉ'~ 䪒w?>>*y%B>y=M}HL RxsL}Ow4" «B7]+sm51tUVS_&m')  ]!t!e1Un 1]7n-ظP ]F[uΎtuteJDW8CW˿DW֪t銶)*$CWׇRӕUVtutTXr*Q ]ZNW!ҕŠA\-g8vJy*h-;]\-K+BG?Z!UEWؕCW33HhAB9pW3]NWR<8MWfN-n\%ڸؕLW)z:^`C]nt%0]$]al52{wb1GVXe% mMwOl/-l8m R?#ߧOJsϔT,_%eUU1]n9~~ZKi҉t6gtj=eZhi;QiUN]E]j1I%¹d?InjB[6Mթ*E" A6G:ZI& ɺm JWeiʴ"Vm" 59*j\-9vJ;șBۜJWE9Z=v"gb >؂*FAY ]Ne<uVЕz`ѫ#`キN1{zP9t]ƨ棳N<H vdd8L[K)1QX>C`*e9\#?k2ARΖۢ![ dJO.wp@J_J Gh{ G(@9-ж `mM+c)tEh;]V""?Q/ȕΥ^OD}ٴuٝفKMM1g] eU6uJDdb1Ŭkn[Q}y>|w>s./,S⫽8fYoB]Iy}NmN]&veS4N4wlhj:Bx(5cWd-,$m[rtlM2iEpmUFg8]ow(%Gn}-=wqR_wOTq?Azn$4kK8?Jw?5O"<"/7-Wp궽ڏAwTiYT.=U|Ցt HW݈JMB Vr5y@}>}={P/Q0K1;1WWmvWT-g*PM.6UAg[*tUNޛS7m-ߴ|#orGmeW6.R2A&εNPn:T3ֵuH)|l֮ TQ 4r#s\Mu Y(DlZ;0Q;8tDrr&[[S{_N; oP4iVStzrȚ~W /)ݏ~$iN4ƹQv|ur1rLQB~댗Zp.y#J$SNas+%=1[$GC%s&.#mdgQY:ox]h.:%w2md.D$qAcj>ig}h&/_E/c8q`Jjh"Y2eoMN>-V QtJ0Ƅ6 H1g <%CZhGGe.^Ho~,pE!&bEk([2xAB2|jUY#1ȣ` BrJaw-rR ANqDePM=҇` t)fJH>Ñx|Qې j%Mw-l膾&ALWnxd.1,@I ZXY ! U7'+7Ɠmq%Wzkk} lg(+?~Odٿ Gpv]3o$ou5ITϳ?fZݥr?zyV v[7:b>˓tp1}ѵfcu6*y@('ӗ7̋2S5qINι@xIe̦ ZrPO #p182;@ZW./úoF٤ZۄJִ?ˀ}-J4xV/f˘@uE2Y.^ۓ5 ;s~ϥbfyScXo9ޖmVkW3`}|mrIڠ-teHp_BFm6(: O fM[`>/>7ne~wOFػ٧w{'ҋ3\tܺQw7uvgKGx?f@ n7MMKWk^}c5v/HsMNg2v?7YiG{aAeZA`YnpyE"Qszwl4jt1&ԭY0lPj )oc4qӶ:_hݰsҾlFt~F{-.η/J^<~sIv]*+C/8Z- d/9ǐII鹍!&c4y\o2ChN*nq`FC]9ٮVg4 OVr i D}bgJla-(]=EMKѲ`UHg!x0rTSl{ơAJAN,~tYO{L.w)p@#w)C9ׇ첔^3! %n\H} 4ef 7YeEB k,6&ntKYЧ ]۲E?mJzΧO _y}`]UCg _ֱѮ67oLYjq8^6H1Q"=|L%w<*T|T&M#.-ȸ(N H./:snJ 2*1C B;959'V?$8G[|w@ScM?݉Ƨ.$,PfQ )8PIޕ9Lts:ۡנDѴ96l/7y3G ]ydʀ{rd3.ҬA FbY#Fy0 bd}&먱v"~ܨ?G_cgӮ/qEOX^l{LH;kSUB*(OjIsLh6^(DPD񉿆pÕa5\Hq;+ÁhGe%L M^5x/ HkЇtBI" kbwyOfև_]^typn=5h)B Fav:#0Q!]"AX0&J.>|Ԋs5rl:T ?-;ĩ~;o3qWB:). 
,xԌ>S 7&Z(DCT?ayꑘ=֩|.j+!= K|G,GcUycvj ޯ2e;}q,6N>a][cGpRm&˄ 1@''2%}D]1[>O|LCjvR3+omOaHo)Lacc"{ Iq$k/_xyޔT?3=o'_' 1)JIz?xFJR/4#h%3LdmznL qg"-[0K\ek";[Hihton;_cӵ;ZZYq;r^]1板[d^_ F,^U{3snp=[^O|Y<-[:^wfV1Avyr֭jq3F:U(abkfZR\ &4J+\0}Zi, R9NzD9de‚2)Y*9xaT@ VE=S3?ZWIJluC֒-]l9_O1/Neqq39ØVGGY>鹤I#$V{49(ϰAaE*!c ~*H'DkeL&кPfТ(v<%g e>>QlG+HvqH[ޥ6>ZG@A[gBA<qi{,W\#ߥ:e$7$}I]sR.+ߥ: 9C洁drI`d6\o>#+ ؜-J6%Y8Е*GT"f^RqTƢ0`5qt_Al-tudTOom=:Qi^-ԕgA2n޿լN\uOȭ +_ D%AC.o{!E}ҺXl[wuv>n=zk\ߢ祖d0|+fÓ(#_,')vAt9)eѤЃIR/I tC1 HϭBAR٭kq+BywW1bN%7Y>T>򩯏sB<\boLBmF8oԟ h< >Ey,1qri e Q .G\!t`2@*AN{2J(]DKBap &3Q[~4//[F*'B2 2p S pGEЗ`%I8Cz h3)fA00 i@tv1yoǵ~fy?h8HƆ0O M*-sQoZ;檨[.ٙ93gI4#@phbBKڒpGMΚM0Kqǫ w&-+mJnpdǿ#}R `DRIQ͓ AU{t&ScmEs =S&ʅT&*wn+Z^Ύ~g-1;gյu֘˧N0 `Zݳ)#yǗ~:۵;˱MDThe4< >H$%,&Fh9*[H". ZSeD.4<8Axgˤb8-c^zkr ^ҴCl#*^) wY|^oqOj=w4ywtG\댌EXJpΨ%2:ŘHs`$B2DUG}^p[_{m/-2W1k61 ,AhttPD6V`TCH:BR\bB=6M.[l=VgUk aZ9,JMgx^ov,3RBO?CڟմVl"54Bk uT$,3{)wNeGI48g}Gj#m3'…5M"`pSI0L2%|9e0 L#fq$>wiFi&g_Wax>U_ɏ_Oߝ|2sN*BW\oƿ,,P!ZfI# |}v_A?.b1: foƏgg~לWvm_;Kd8ΓpVvmboo)pͽGٸޕno||'@M+9T n5[cΞðYOWM3Zտ$|߅+_Tv6p<E:{yu&\aZO7(Vݸ݋U-QU? 
pvYWNx151._~8nl=l(5M;gu//~پطҍW=p n0hh;U G-Kb$r <,[!iGqfҏ@pFK=[+%b3!obW7% u<@FɩwM0uYůźDބ`w^N x5r[3Ok`W 7;a?ףOd\:wojz'Çؼ j.YKLΡ_csK^&vW{鬒<5ZΕ{jjC$Q1b6XH90/`' D/"I^;zY=ɨ.{t< 4)qm+/φ8E 1K!J&,insgZYάjMl: m;G@Bc-fU븛XGE}Kg_-hC6\QV #hm9+\}8Qk)Oat(þÉn[N3Y$`ɀ ֤-srEfO)I?Q«`]TN[-4DJrL @'i %Y4ҥh)ےF{u: ç+=ȉZI)1nD2pR}l).'eOB>Te$Q!'5.g0x=">EbٹlY.K\la9UnݯL},I.YRKڃt]>7j "GZ,|0J:09Po/k bm-Go w3( XRUgL)KQ]_*WKgsQP bK}ԶJK$/W X3^ P./& O,*q2RZSue"+h)BwW>J0MAbpr9)WV UJX!Qs\m$X]o&Wj3jGf*=z*^tj.jySvիftVt9DE@YY(Uu69ir濑55$h+寷/i- U Q ]4T>b\Vpp4^M8eNpch<fQ,ʁ8XݵO't4.:<0jލ?ӾbK7/dNwUKo$n'ļNQlkZޜ4JQKZMj)` s2Y&|D%BVJ~JerԜs_ gUm]5#ֺBoa*N hPx4(WR<Tki=P==hZ˂p >}@%BC\ m,iZ,Me1BL]+^\Iù(ɺ^W{+eҪ \`mʱ@o+T)Tizjq-{) z]eŌ]Zɺ+T}ĕReA!(Rpj:PqְSH0'XW(·fJ;U/ +0g8(xÇxF00 _L|rvؤ&sBJXg4fcaVT/~H᠚Z\3MQPИi]dЮv/al6o:o/z#t}9X&^߬ q6jr1/Ϋw|]uC5Ayu G` 1l)]Y3Np"rp (,5?vԴ Vhsu~9l?'دφ NcGk{K`L=s/&ɡP0Ό^9 GL<^ȱϧܞ+j_ivo?[tz\95QB}F4fr7Sv4͸JݱiF&{Sm Wwt3Ulb S ,Kyb ʭ-=i%`+5e+[,[YkUֲ=nxAv ֢^j9!]P%eݰv'Jp% RS P-

J UI \Z3j?*E #PƔ3--'f bpj:u*# ֌Fn1cWVt޺BC\,;ӵ+qQ-W1u\JG#قpu95\ JAdWzׇͪD*Nw n&2-ߏj#(Wf*WlNJo+j۪`[jo6ГA&֧/_ E?fUFqR*jl8cXܵ _K&W> Kn"o1Osd6fyACIq1l&Y$B P-Y(=եJFKlIuqqEZϹytՏ =uoM󡃻906흳:bkAuS6,$3D5'+wv+-1;ks5an/F_]K!?px}.;Mb{c/Z6 ;Fض"6 j EꈖzUPg|6uާx1Wۯm&PZ+\Z^& Hyk1Q,=Zr0 &g4>G彳Kpp %>΀n_U[z=7Aͭ7Iw|r\4˛޻01iЛޠq{x@JU~!/NlPqVx7l JEV௣1!rplfc_*L. m>XW~)߷۶wònQAoy}|OB^o %V-Fa7nEN'@`cKqnh1zZfI߇ed_MP7-p<oS.wk)! ]klf؅kdWBl XGi~{xz?oh7mFW+L|_Xx&j_Ix7^߇,| ƟMϗ%nD㒹3= 0XI?̄dJ !S.ɨb~ݲ<wƅ9|Ժ &;Mqj%](!|̎\%&2ph'$-N-N+K,8O"XT(3[-NfdrRyO~I/Ys':OAqE>$G8 Ns)Er:`cnYssl2HcgȽ朹A:tR+fАh(3 X.B rҔ#GC^$lb<*d%TZx#*$F4 ^>yczwEifYЖ@rtEtfcEXĜp6N܆u֫cu  F7B #y9n3qܺhJ$1hb:fIǎHǦw8=Y~/kϿzl5AUZ ޹;pQy>Bb]lp/;pxS dU QT,ӟD1.qh5M8=|N7ց~=EOr>gF&ˀnfm٠YYvkA7vR搝T* eSWԚ bdd-n*4ivmUB㹞l?_msPf\_۴*Pn,e>z> ^x2kQ:{7rg1]MBwW]j}3j\HvH+*һ{*]ٓkwfS!d5ë5j^_"d@2GT<[;ޱhEE]\-nx7Eu=u9::s;9^YR{n9v\_ tc֙P˼Rt':J'L] T[b Ly"Mڸ$JN)RdH; ,t݅*(@ tV%w1QSd^Bx#D3\{g9Ocj2;Y4sOwFvيfnf<&S[f͎mߺlGcYP&`̆$ɯd{EPJ4C2zI ]\b=5|cv [+.AR ,tv'Džm{~.渙STr;81˗aYe30l\{] ޙeIX`b0xdt`! D(?`gGt< B҆ '$bfi1QH t)ZX֣=-w6]*Ruk[ȿxl˧F D>3=7QEL wT$A΢pC9k[y! PgG@ּ_ HJb9E9XQ;k~E_M018p ;GKK<2QEwdֳ!05 d]R<CT yhV H)e|JL0%]G *`b"R&`賳-N"+q'Ls ?= >ٹUc bv!1KLg\ee8\RsY`9;U-Y /ZK1Pqh1|g5@=kO@T}4FiuX`w,EʂE<֫>t& ;}R.U1ɿ{ρ~aO q'BdbXBz3}KddZdedR$Ie :ZZ0x҉'}CL45(< 7,DAX$CZ\BD3z-HsNx(kmH@CCM`w}Fi/0)ɢH$A!))QeMT?WUĎe4]-BxΤ%XB6ZceYUbL-XR(”ڮ~\sG)ٕxf mss:+kI'j0H2 F( v = yAƈ8{c;0vP]# >*MЫIR۝qJYUOZ2sHMلJޒss:!>w|R,a>7c̱r:ce˷e$"_jY"M]Ǥ?m_vwirͅ]|zV\ *,?Z _xO~oWIfy;uaX#On)W逰{1[,} * %@]ZB=UQl,,të E, )KĿ$zSo Wr޽ۦ.ÛA J*:[f+8K78fq3?B͚0|'jC5ZTi R1͈1 N56ң}T2<41' Œ9K˭beR[Lպ^.ڥLLryå T3cĜE NVFG WRëTNZzgyoR2SyT~(HE3A=١^%tAΟO^57ΔK]fs^X/e`ťQG4v`nO:w7M4yΘ2k DA$^t"P52QLYUD>whXFYGʽ3ø)a1w\l>n!Fuz9-_BzS&)N}O @-4..>^%ͷ&`~Y074:|Oh[x򋙼0r8}DCw)8<\Ap|UhvIXa |QQ&W x)Yȹ'TƇG?4 Tx i_@L%l6Uc{z/^ݰ0fJ64nޯmz *Əw3&=?s]MQAP]ğޛVcq;ߑ:A.ݧ^R-; X:e+A˚`'Ġb%\^j$iRbDeeﲈhqT(3"tF;%"62S!1H$SvJqTQ( Ȣ`;ZX+/B"IlY.$+# IA<)}jxK3 ,- >:,b91׃r1ïUs$>~<D1rt6KW-0k}ԁol1}ΌR: 侤18`ƽ`cCew{F%H⨝2aOy/1",DhA #(Dy$RD3llQyl+GJ)3mK,h:(ktqkƕOIW}e՗xKCd!D'6  3Zv'38?VWua/! 
oG085ByBhK\#o6\qLN/lrDQg|V%T8y Қ)0IY,-i?aظpzIi {e ywEijY9Ai|`M-\W.x !w$ q8Ds% &B9ycafO3O|'zZEO'y^^DQR薌Zygu X"AxI,'uiݟ޻$։G%Njׁ%*qcrR9}H`)fά4z,0g!1=?uZ#pH`hQE-s@DfK9xbꨰtdP`B3<M-RQ˰s>r#ioU(2ÑгN?\53<-5>h?MtY o4ҌT@$/!Cz'B1x?FE讨kt}Fwh}|zve߸|J{\WN 78ņ1{׸7p|Ɠ^\2,$RPK8{XswWXӿ y:kFl8w-Ⱦ,慪맋!pEBjB0Bx>gf|5 T-T 5ko&y-Ho!л:ݦuvl]lӺӾ0qpS5m h7p/5ĵnJx0:J-uZX$ښ%Ͷ-)ުSߴVsr|P%}%^kt ] &Efn3}z5ѕލ43ߛVM5/(&80YHDyV#I i J,0+Qi +,LcwXyJ 7вۻX鎶iaYBKV |7dP4ޛ}Ըdi&.ίLJJ-Ѣ2 Y$2&;U%]^Hwo mKc4-͋ܧ~|p9eZvH뙐C`?UN9A(%S\ >^g\1pp%b\2pKff`:7/29w`wDߑI RFEDQ0j$:GsǢ6qyfև=,tz1t|tE5DY`6*L^#QBU+cBZK#R 6.nR5_M9 { ]lʺp;?z]vJUfߛp9`:K^2H Xi"2IY{TN Os22+Xͺj9V@vlsV'>yqV>ټVlu쵲2a`"¼&BP5; UTsc%bD}vv.aǑK_zLT|\0(e+ qgY"ζ;G5}`6QbT;ӫ`R c@_zjlRbnT΃tch4_yJ$`dUEz)*QСJՉoQ\ID\U,K(>J{g'eB'S+"ohu@~{:#r3ہK!p;:?@mmXhG4M3dJd8C HE-$rj[^eŘQHzMSrG*$^9!IeJBo&fwhݝp%P: gfRDCKg T295PT>N>äގwxaGZ1E  *PAHS'hZK~3"(62,c"M1#C0!B*Caj3jX$"﵌tFs+%"{O6r[,,ǫm)ٮ,ˇ,x>^V}C^4aܻN50(dQŚ;tlGAH;d*ՁP{ NQeJj,6&Ґh@$ E9X ֋ {>w=M_Gl/4I/]4r2v4< CbE 'ַͬDFW»(K@lAfLA%DPjDí  0Ԫ97 *e}rD>Tl6O>JRd>HW"SX0T7)&(i=zYJ%JsR1P^.>*3bPvք 5fdlUaa+X,W,'xrZd[6?r_\Ѕx~|4?[R.LlTƒ 6Ħh udJ "J*F] wC5BVBGsUMf?t&[Vw']ITDl͚s3b$31͎]Q6 =0fɅJsZ䂊B$YؤCq|h"!Aۮ  l;P&`$Sl4&đ1"QMZ 7kxL͏]#"8 ^,^f 1!кDmdPGg{VRI]6m5{RmcFK_KUJřcBCM[ʀPo1ƈج9[||M<ԑqqWVd'\TB4E?​x e_GPjc ,%9Pf#}hT1ٓ1O:jr.g:TdQEw!z顆ۧvnsjC8NO58fFijPńɩRL1 +J)!go1bl/\Lڢ2ID㌎蓉F̤Y*ZA+͚eMhz;ӭ(uRYڔauzWvD80,NYSZI/lU5,˒TPH&Q2ED#Tn9VssK]R۫,kKhYRXPT`6fL0#H ,Xi[%ÛkEn+&Hy.=> {haӒP*[h-"Q0Ȃo ^bU@B0d?L0?j*؇@Q!U¢g_YaTq5 E B@ HmibCJ$g$]dJ!%l-4#'FoBn\ڱ~ Zm.5-:=g>G/G?i4-(O?/9;/gwiYc~o_>ڂB};y{^#4{gS5avxk $ގW~_gyU3Œ|f&e:{xE[{NVOŎ>sM?\x߉g~o)?ѯ˷\8^,tM/GG)͞^=hy7?z2A#LF+5?zVpqX].dGXq`KXb߆C\Gc6M=o_79?i\$t)hpyeg[TnޮX23.Ew;@mwhsv:þ/la?:y;d ey4NcHonSgߺ{YgAۈhpM|[ \\kַ̫55xYlI񒇕&+Ks:9=ޛpVsbp{x 6u}i QD>F]Aࡇ R+?Cgx| S; !bЮ:N,4~SV!j1&0A:ɰQU@l/R+gI$Etɇx ɇLYLuϳ@he%[m՜-2,ͫ(}`; IoXdhIQ[˺5gM!&)Tv Rc.c 2H:J^hš"DK"2xb&c1SJP>SR&f zUVD"X`(T}Re X*Br(n6}V,WLYm1J"!Z !X J"XIe1Ah`c11#Q6UWRwZ{v r:k\ ʺ8I);|[3 \Iu Qw&U' f\=QɹKfݝ}'-36uuCf+UlFu 1 XTUDm6llY}`RPxHWLqؤDQzDLE ]TcW@L.Pj']ݜg<~e?Jla# 
{sCQ,²aÅa#[\G.QC_ox~hY/y9._?v:pkϧaZӓPܩ˰]e2{9څIϣz%p >PwĪ+RB>w./Ȼ6_uu?ϭԈdž%ڝ}P>ix} V:;Z{NNg|g5hߌp;_ Mt{.4;}/?Oӛ~S)LߍPEB{EH`^?y]|>WύB5d t]4dB|e`#& ^ibr.9I! ̑$(.YA6R hpi Cr#)e3D5HX%y@RB&2- MK4kXC)fČ~9* =*d8X]_m\eDl=Ѯ@ݰ]nJJ@JjԽQ*tL+:WZZFcJ2w ;t& E=S,b2* Nqb}'}<+3`kJamSCS$}6XK}jV{/x֭y0eQG=&mYw [vyQ=U7zy2_>qI29 2tʩtmHЧ͝2`nnp{9#r"W1Eʤ^_pH(QP"4{{~)3Yv*$Y`A̕0À2ቺqH!(Ƌt a -MrBW+ TLEf5%߿{)y-K:*(QL%)jT-4Я)&YGڏTYPӯh8#/l^ +q?9Y=9h\nyJGhsP<:'WD1D#h#ݩ0Za/PqCȽ3d 3^5 lI>!YY&*Kb;fy1Z+&h'HAI+2!Co,}$Y.d y4ZqLO,_NKZx簋@F *+CM[- vܕII鹍А;}4t iuP Z[-+s0Ov'䱖uu.OEʥ/e2RM.l. 1gpu!-u [W-bi5ˮK}܅ȒG/oz"er4ot*IjU +'PVNX)QIGޗBXޗ-gl/W<(^Y'REMN5:' tHk 19I.A`޲܌zka V7P0,sZ Q>N.X*G!!"胰hL .⻵;35sV 6/8mFn]v u~&2(l1`PdxROYNVۓ铿ϗt6|E` ]ǤޕII\!#MZTvIJ ҋZ2$IV)ЖPYG06NL“WF.up&BF\t6 mڊ|tgWs4]ӆXcCvzqQS7Ms]]ud~_͊w..' | zCo_N\QQۛ51=ZBT~';tsPzhEmeRCsp¨=WnݽE>89 ұC ՓkΙc+M$5a*׾3dOgUp7r얉ͱcGOÎ,v[i/ho|[ G)GZ3DU3a_ uQGY+U<~әs#!Ds;AhpSs]Q+qF"ޖ4Yu@gixAR,"3d1D@|=+^6f2"KRTIyk5 @*,sDm31hFÚ&~gŮX9ξy"JmۑhH\X:wv$-j->?~IvJ\J,A ol؜v~iawO>)-* RAUaA ;UlC7MMMv@7}ӻ}o3j3C#6IocUtɤpVd-=OҊƁ7,ʹ|,HF {^-i ૠ+]ϣ|1 3}`g=Цfy{u`.}7up3.@SQ;UZH}c!$VH1yCxRQ0zHȲrEN ,Hdoz%+u^>d@is"$Rۉ/g>ooV{M{ɠryoV[qBi}旓mrZH%_z(˵ZFs45$SwH;"?ޫ>[M`z1ܯa8|g-䪈GW%CrU#51\Xͯ-gl/ـei<8qcY)3 wRa&I0, ; nXƶ #˄Dy>r0pGL^H2(r`)[К88!eزؓ39dutcu6mR.5[ףzm /Q_V{KO4Oi8BeyU|ҽ@]ZC=Mg(tmE\I$aSq4ḼݢDoo"z{zͻvԁ2l@l]"o,ٚa#KneÏ@lP+yQ;lkmĶn=z&H%(_I, )khLO=SLX/]}lTBK0s郕Z)Cb*!,"tJTă9OMry =z$W=fwFx/orzzvT^l(oz$e|VMzG=E\:qz5DTvD8E=̪{w'&ɥJK'QU D$mMT0Z@5a51A{ Ĺ1EIN 8s1$I$AUI !=CbcO9V  #g0jjMCR}qIjvVS}Z'q,l>zLSiR1ܧc>U{+a`,q=/i ʃkO5T`tD 2jȠ%HfTЁhKh&7^0-d1qj\@pσ?QY5H DybM b5Mdޔ^A| /f;0E~mwx^}Ko~>Mkh'2 MƟ痳kƏ?-_ٽЕj;᫏g~b=c+g7 Eozv9Jɧ8/M(i8[{8pR?_rbge;\wpE7ko󣩵枓xUBfB(!tvMdQWrMkLCθvu .'6G4{8ܛ3ή]/i_yqJv]jp/O$ډr4^y Mʈy?hz ,5- &t7vƮqE?VcK;^bX== J%7"ȷm~2(3g?qxv_Iy>K53m5ZTjKY*Xo\~^E_ mƄpQFu0oN6zZvwoM߮dx?׼Z_-c'yQwL< h\ҿW ߸_u*iKH6ϷW4.guUWvXKǏ,W6V{|!r0!SNn~jjvvXGv-z{1p1i*psYͼ_ǣ&ayXsc>7S.̊XGFM7^׆hn*#Wggj;-%uM}xK|?Gu&CMzSf_zwvNuwS|z3F<=2MN 3E$Ukm Epeqhc1{9\M.oZ0E<2I~tc nv,P;j7{/z 
DŽ1ki&fZOC{Fsk{zNYo6Q;u+7G'i^_gHZ߁9>8&xB罎iK5eu>,S8jCjUrA)c1+2t^T?bq`YcJeą'^pI{Rnm3FzWqV} |JNTsW RgBū*HooHݷPn ;y8m iSknJ WM(_.FT΁ڣ{ub!§WZ ;bQJ|HKR[Nhg̩1HxnZ9C)0(ɤ}aɜM}kao7];M$Q 3 z%\22ژQMD+)PƢyAb ^O9V-{6r"nmg]AR ۵r=/ #@v"܄~ C~Wvu¥ȹiƓG̈>:Q?)Hד8ry}>Z.Kܕ8*Œi\Ԕ)‘|:e)YG0Wm1 ڥdE[kGIBۡb'YŲ8@㵥7Z7̓VvkQj젙m{]j~_4};h(߷x_elד:e;TXM-bz w2qh2A<`^#t=ˋs̹,*Q55АhUGxP$kcH5}C\5]s]|"/+X;>3[n)(9Sмֽ߯>sgWVPm$L`J9&=e Dp\ RpoM(Ǜ5 g~>̮ ly<.p3Kfj+kL%VN )S[4\iuerzуR ;4K[EXɋIyԯ! Qr  [zpԃGZ ф~Rŭ>"YIc)!H8"qF'u((*4Y(eRwr4S| AΕMx= 7P,q7 pZ[ya"*eEoĠi6:BW<]@w v8k(vـo{.Ac4T.{7-@r2*ׇ6j-cc9Lm#ƫbG9Ɲ8b/x4^QB"Gvlƨdbg(o*DcCTYS1U#gϐX0F `$ŅO[skpG٦vn]-lcΌ mv.Y=g[?w{ȭ5m2uՕ@hw`?>5U귕JWͣft\o*\i-73/A>yev~[_]y"kNc6rw"+O95?}]G} P \rOP?o%Rkrmgͅ4H.yZ>k8azUvwI&$Jp/+'xXm'Xc$lw]^H4P=%:HeGS&B"h,3HD.*. , .[;*lf7:׳f^$#[g XltI(12ˉ(l(=Qd6:rG'&86҇* GX[~zP> -)rM)CF1IIXUJa*Қ95c9RLCu* U gw$4 mM}{jyǷkl GR dVZ4MV !(W(:(RRySLnx سeYmrs}%f2@b|LkSXc#gvQX11OEkC.u{e9'RJF%g#ӄ&`N?EѠeτTzgvD !hE{5W F8 IѨږ>.Fn}XӬOE#C5)M{ՋqEْp4Pۖ9% {` Sd窫X&޵u#ٿm_E v3N58|H XvKqCգumVۺvHV:U,ayGBXֈъ|Zֆ31tRB$-5-5 Lu.Ζ{b*9lrOqpqV 8RX'\Rø("'u]#L:!mcS.t>UHL#Mg_ɨlвڒk;-* @" Bp2! 
kH$PB*SmJ8`֠.gSJD0_P@Nضq}}bQsWв|YKmŹ6ݑ蟳2(-_~_}˟^|ߟ~_F0.OſY4 ņSC{IkvPNڝ9m>I@,ߌ8%~?{ӷ /uiλ6LK>8~q+S(qm=]V\r.;Ej{|đZ&6hUҬ^X ,5TjXV`V!0),Jj]]ϱ>}cd;c[޸RUm>V' B+ָv^T>x&rxͻ7KYӳYn?Nl^ ^ul~iWV]t[MV=!>`|g=ZN$OU5VX;Mc5n#mɅ[N4ҔO[mQj[N{hU= NiJy%Xg|%$"CbA)< ٙPk%-*,«a!RRNWoQ9e45gUXT!!M9vpK> ;K٢E&/u\Uq2EnDx\b,bDsIdF{0ԫ*gMEpjH-'etAخ⒈% '%jږy)vt+FY䭔E@) NJ;U'1A(lb19kllllZRc~ak_I Ja[,d.@@/0:픝&>:6oHGKX1D9"w(Irnr|F~fݽ{6Y B*Tت网|ʞS,J cʰe[?D!)R1[jUWIR$s!`lY*N &t* NҵϠ'8sg﷏4Q`ziUNpG}@Ԋ,?hy0???"ޚ6&bE7O?_kG•ab(".'I|r_M~h5M=q|xUX^MN;Bq?)`N2vxtbI@:vIߎ<ٛs'_-H.|זBs/ \W, qo.̻/烼poŦFhwޯѫ"w~,XNh^9=໇Fۢn3|?-K٫UugOGчt%t;py:Z|OB{DphઉUֹm&;J0]5{2a)$|4p1˔ fҕE!m\` 4$fA)gͥ mQ335ۚ C8%'9h V*i;CVZ S+}WENώAtVcRRYFɨ&e ko P&Z r.$BƌiRZ6vJicb؝SN8_S AXl۵{pFY{->Wf%t2CR} SVPZ*h̾ b* ߦ7ֹ &VeTȵ:W-lM~)\M=Hgob7/߇rzrRe؅D2 `#<%} ƺO(݊R8ifԬ*2Jm5D+I2f[T坨䪧CFo$y7$&ObtAwtpś\íK8&&-dovM,k=ZŤRۅ$2G cK,PgHrCP@ȑ-Hؐ{.AGU "=B^9xo{ L$E`ˈ`4@B 댶^)Vso!Lv,0:*f+`CيKΪ8 ŃAǒMWAmnC)X!R 9@a[U(oSQA`s + ) CNX|/@;R(.)UPmOH+CZɴ aB˽h`ÓG泃/U4%"X. Wb,,2 Kʁ@:5,Wg0!. vlAwb0`\ ̽8@ CR!^6  #/ tќoh(Q kU rXVڮw "l VJ!(uv a*3!(0s8e^a@X:Z0CKW/u z#RtѕpBi/U$=()IE>-85ƒ)X^,pVY Qˊ(RD#[VrHahA+v$DD[nXz 3|Mm3+UW ;kcQ˝k%,,AQKo=kcD% Ԩ̉0 @3Bwn'C[+`m5K*,BV#<(9PiDL2Ӳ VvQR0~a5cŁ)^\D2: J9[&P `m@u"A,Tח >ON]QdԶ )ie%i(d_/ 7b8AbC8*[]%_죫FUME\J:^#,CJKrD~!zQ)HJ"AQvE@x3u^_6뻄S)y̻t? 3K 0hK(8#PZ%RIĥowz;C!K3󺫇Ik󸫇Ia+w];+X(v4 XJkš+[rW/])g]moR7xCөr),7b6N_>!ܫ S-O'?_~lla4'09wt9LSaOz /IU?ߺ݆Ru) |vw~x1F"6!y* Nٰ{7G@\رVC (^nBx~L2(VGP\]+R:ڕyJ #8"wk~4 5GPZ]NzNfpx:<:sTK U8O no$̮Y¯--0[VΙ/ 7^ï:K~ޚNq"^'ඵJC^&0&7VaYT:UzA&hXvVX('Nd._f}Go`~}WT'|~}uslElv_!K:֑ͬ`/*#|7mwGnы>. 
__v-כ.bL9^Nqs'rϷOn:/iǔ Lwݨ{w;, Oms+m;۪ז1M N>j^tT&}.7T%UmTb:zXH|P"S}W] HǀB1jJy@-Jyg t\*"͸{gë́z5@\ ߎפ< V}'nUN^1zuP^7l(o3A=GR Ǔ܊JT)%R K,x?xdbfU;7)4K=0@왱)BMO&`e=`nYBҥ_%݈w?ٍW:>8v4+]|8_moAN )M_/7z* MsܢwMv<#\;zuu*sfh-*>&S&1A#x"RHXcQ$tvV*g΄N iW93?6 s'B1ϩ0eRd m "5J I']Ə{4,ɊgUsp iqA׈gfV\BD+LcƁ488eqyHћ-z[E'm,|moO]= ߏw)xe"!(uEX&TxSHq qʝ Yp;" E`*遍o>vd^w1o`jU"X Dp"62fe=g*%(Zs!‡Y:C'VNOOUDcX&&XͲO[%@1:q W:;[6'sEt`p&,>eYxU`/18|@p=8V.bHvQ+w`xX{g:;M/bRb.U ƘY"F38eL)3ĝ*p\?` d#Ltr /1$y e@;|&DYbɂ{5K2d yPҤ᎕e^aa!BqJat`w5nc! 78;'9wz2 jSz5$vTz*ww/\mPӫ^87x\cXN < "ikgU,1""ZXʰ9*B Mg3Xer=<Ϸy|6`0U$!8aLRŜRw9EI£,&{Wp^6k5RMVE .$72; bnC  'ԃ@Б8/ 4!8pvk-q7#ɍ?܍CLlؑ `,,-gL4E0W/5Z[`_̄|lmYPf# uutjv]${n -RS6Q (a׵gq0t$Om ӌ'bN0>GU:D[*Yݿ'^S2?taGtt}0i?Vrx&Fs&-p#~ظ5D0yۧ.$uKI$a[i׽>K~'?Mkp?486 nilOHzA2.֙ZXRYm-̛Oh~.r)sx23}umOnnt-?nrOeַ˴*30stgO,Bfn!Qvnwh6v{ҕ_:op{l5|bt״gV,tmn|&nva$|FaŃug-bviݥFnr񻓻e#zUv2Mz YsRO,>y1WɵZJ6++" i>? e l/v 8WO*/%[x[Tt-s%xCLsqzk`gg㸑2/۶~0r]$ؽ X'pYC EK<#HvW잇F4H#رYMWEVʩǺ".4fOs5̶ vm8zt8l{CGKb8¬UI8faM"4RשG=8z\pduILEyR3F=CȂ'3UE6q<:d|bnqLĻ4(̳)zGx=+^&f2"K +TvTX>XWGy !4ӯiIoK.Ug .7o t Aoxאxo ^f26*xrSl=Cb۴ҝ#HvɔX({d}]y%[%<Pݖz t]0ơm`|75rkau]Owѧ{ેݶ,{lYXmް;ٸrh+߶+Ժ9H두7>>ݾӜ!6V4F)h>+3g![Lˢm6Q1zE+k3;;^2(K}G{T6:TIDSP!d2 3Қ I(\9Uw`|vqn)s'';׻ɯYk)\F43Q!KJ\{k}a1IHP儐c3Eu_2E۪W\kvY͹4qi¶7"ȾeݘzځPnx/|=6L^_fjS:?~4|/I dK+~j_igIiC`dIMDTęSגgG%|)oK-eZHVie'J5Y<@ӹ&6AͿh}wet?pYGtx8`ݤ9nJ Ja˒}zyF4GEN#` fٿwz4]m/<9))6_mP;2oLQECx x2 N;X r_ eU\<_@<^Dxon QPvhQ2_/rM#'tKӭ8N( f ȄJ^#S" \-S2I+ &Ex%yG}?>.Xh{#y\,Yή9_9tF$eHEcB6,ƤQL kY[xmo);HggN}B`U"SDF)'w?Ep^2>sDո SPҊhdxg9!9#sh&v5v T5r ([%r NnN9*(FhX3b|J?t)`ɧ ׄ f&ѲXVYvkQIvR搝T*BX 悘uu,K:U6Ji x%d# 5\{9}5qéV숙~8[{H zٻ%\= 瑠7N}-׼?q5d l<+{E\M)C,CNDC]s˽Zcbb vEYdNJC 9j+lrdtGB?6g%*-b@4: \t# B+\dK.$U2xTFĹCmD?ugZ3^Wf25;2[*nvUD[w3 "N[=oEi%ѕM5!76;ly(@h{-wnĠ?b蛊jꊕhoWumkިm>vSx՘gOr06~fK6ǩ7?ufhvU -xHM8鮹U^6ŻLgfA1*;L4~%XTD@3.7q4u.osn|)lz ##pk5HJj]W/@ }֦=Y&渚StTr=#1KU{pֺGFT &ibらbhGFq$*∉K#(E<B҆ '$ANј($FQ-q,5&X>tՎ!$}fVs"{n@hRS[ , gHO=׾n.{Tzܐd#$5/Jp@IIZ:HZM28ߎ۵Kq|}E=smUed,Պb*EmE%72#?oAt R-ȸҖ'UEra=cY O%ivu^ 
yhVu-q6cY`hp)K.rpMGA;UL͹V!msD հJ5[Xmdj Me[hz[W[N 5 \4diG4`DDdݘp#1dB\lMBFiCl1eFjDŽ~VCǮUEŒ4DXFR(/yh4#uOd@ .![&tUgXEgCZؙ"8 @.H x"NXXM RWGٞu%;EqXe{ō ݗQKc.g02g1TWvLl9Ʋ,H@ژA4b,%$d9]*dmJqW\:u{,bǭ7&8;=?^7?1:~7$RvyNϢHIe|/u~wIK2b'eux1 o-/vW~^`딗< cLǧ<"|3޺b ۛwJ1߭;PI*o'?F46~룟jّU];:zqC JzK%gsU?gwkٴ҉/ӷb6e~ӛdrvw S\aLNߒ'6Vx5Q`3W d#{?G^xt![\?]mU݅,z}8?&uzEzo^w8+QkmFϝ%oR 9A 6Em~ƒhQFn=Zwĥb7jN {V48cww1So=Iw`k֢vKv0[]/dW))`׌gp(rJg74\: S;; OfcaWF4/}%-,[f9Y|u'!L|L-q-$V:<3TD޵q$B$y~?YNjؽqr66~ZL(R!) fHzk9c(co6?e}e8jV;u4pk:J#+V:"J\=ux@G k.:CWWUt(7St㆞!)a;&\%vJW5jҘV#ٞz*%+lg BV%'=]']Y[m6wqnd_r\G^l8]|cb@P!X~6Mգzs |Gk8dP%+뷷o6_ZѸl()}0wֱ&Z!H,``L'UX5ܻt{j x!ZI?Qc@qZ5dyL*^Z/B -Իra\J%| lkvٍvcnI , go3mmtEX>S{r?JhI9pV\0b10!SF֒z?edR ۨ% %VNbaE09,N>i) 71[L'Q9 ҕ%.%@`.c]\!; Z[OWR>{гCP~KwU; CiZjjۡD:DWXZh)m+D1tGt%aiZ̀!#Ciwn/nm%]Gvm*o0/GڗQϵgq ե7q_˟& -9ǎs|qMo;7XcBdZv9J6/8b1x{ 98:2́R|Jo9IP^Hrkq*y hwKYQ)- hYc^-|Avgz=%(+NG z=m9j%.7} NRcL192. ڝ~/TͿ:cC_OgK8vUٮZSԴ?QSu ,ph?BO`jLD )`5HFssfkKmǩjAR%w8U LnLinw~QӪWzW .V$r7`*WKy "s7ZWG8b))_\aBtyfRs=ßUɗ%L;/f0lG9z)\~?h8}Cb%󪖛/@nާ6oN_RNū$}H9LRzj{Y=r{L QfT§' B'NQ6(o6ImSmH270Y! Jh%&&4yXQ; g9˅Mӳ'giLߦ NO'' kL3>rjTuzcՒg"uyltt ^Ka ,ryХW}{a=(+?{P3>ɒ8h& rMР"k^gK"YfTFdQ)I#u36Y8;JpHifl ؜6&rj Uυ ׈msnTSrG_b< 7OϜiVEa8G!uҁhvxL ϏfGJ, mielnx 2MtP+5YNL$!ei0c$nmaƤc[ Y`xrD$*ee2<9l |X:i6ńFlL>eD0#{F7aTّ <{.h =2+sbss:&L6ʈPB')8YVE-XF#JjLI3}A4̈y8I0xq^,UΦd+^dCBEbϋI|r t{L2 19EM2I Uy*9e6h0dϋ‡IǶ|Huy#P:E:-X?;Yqs%PݝƇˇ.R@_Jśu p'A-W4[%$jp?")Hd:mPpC9xhqqe_fa)"AlA/ CV ~&a#6n*qFivtעʣjңdua:)Q$*ޝ1SlsJ>nJjQJj(%1[%?arJ"&꘥BK0T`b|$BpLc /IK2F" 3x׌I).ThKs$6m6Yk6mMFW;W Q(`J>uǢUTo7Mc rDwκǸ)3jN&㙣Di1DH^&-QNO*ߐuP9ż6&lebi61 \ІitgF82X)ՠɁ"$<4C\]}, ],7@czb6ۥjHKep3ɢHix&yOaYdH Ez߉Ʒ^?d54B@5h3<*I*l1rTvDs+RM`1N !k<Q$g`eJ 4 ^L/Aol&[* ah+/%{O4hRI 񟯏xϯ~뗯)3/_1  ~3y@kI#<^}u^Azgs LwvVs| XҏǀՓߖO~}8Y^xiVѪ'nhp~v ޼! 
\^ںnIȺx^m .“ T9r`ekuO,y|hqzvyZpn}5G=[R7 g?ipc'_1eY*(_/4{~iy1oPP)QݗN9j\^0z`c7X c\ˇj;n`r>ug`sƩ4YTc//~-W۪uqc<ΣnXSTa0}rh!_n(MpnXj-m(5J?[1]b|woAqF&ō&1}xNnqH w:4uf]R`ww't} A[C}kʒ{b`:29\U͛F^d8n0>_ UK^.3MZ-r˹K:_G$sO-R-u/*F,QR )2aE$gzH_9Ϊ/UӉĵLiEI^;!)Qo"%JfOMOWWu$8cX]=ߎqisVS1 o֨ 6vEDa oyO4Z+]u4,59Yˣ o'nkx~/}Ek4 X5&xH j?Cx7d‰TXN[:@2VB‰1aaÉINDˢPc$Æ~B 9KGA2ds)苢ORDQ'iG >dLR3 :#cy֨dc׌*ͫˋqz ;JdXdL4jILԔtBY lt1gI@jz rDYER+:cLj%Xxb`29J9ba3R}>vU6fc(X}RAT=$BN}!/T2 hIX~tFYJJe!,YH`%xx]r&n#%6_ [w@[w^Πx w [}.,DfN;/ [8.@Q"N!]Aul 6 +;–F/r 02`S]|("jb7DҠXPUۖmزQgR H[X2HBXTL`P:j hFv8X fzn)VOWtS^7Wu*~+Y0j%8@CAMv]IA~3'\i[=*E{Rޓ?NIJsϻge*䨼HA̪)]:k_ĂU#6+BuL'pD'p IQ;X3fK0NG%r^ K|ER94Ci۪틭~-XY ع2\ >uJDn$mz?0n~3/ܼ..6̷w;zɷL|!e=S.";tHĶ{c $ lw3:5>-mwCN (Kbl"D֒wfdI@h= Ѷ1r` %ĤHCMWd=,1 zv >vW|};#P'W9On޳s;]Ba|L}٘%6Obds(Vf(&v!'ދ`4"ǨdqYV$M w1d;zm3x2 F:hmLXDI "@EsZfN.H'< H[4#g VKNdzh]+W#E]OQ:9ܮy-;_]7|e#Hm)F,Bhl4)2 *Q*8kޒ0i#x. z֊A7C2{J 2?\"L1P-X"e&$OSẏuߟ>JDqr`湾:+S#f#Q*D1 iB0$MAI͸ޱ~LLo{f}-p`hQZ[v㰝@X;lZANk6;A3\;8U2J&1UPJXj%՗=ϩ==kcLkG_y_}'ښJ}@$s0[^EcϓtZ;cNkNu)F8i\f ĴCDjq)Buq.H^.G7Ƥ\7y}\eSG>8ǯK|X*S$&g=\E2hC~ͧ?,1,FtɊ?*sB}kxqIv3ͣu9?=Y*o]wBR~O9}FO.vA;& nC2Wo3ߞ2e1|R+rEv1:9wK-:pIju]_v Vx+?=f#r05B_)MM&lLwmgikfʐl$p Υga9ߓ.6v. V{&)NƵlھ·Lcn;lട- ){u,^K\/|OO6+٠lQ16k*Ġ!)"ZDZB#y ֑L&b|8oKȢE5:"kpO(B/Fep^:YI_ba^f.iqzM^݁vGէxf3(ӸOKJB4ąG/pGRԟ4q8P0\7|HH>t\A^t2L2 :RA(Ө'@ڋ݀6S}Z|;h̝Ykvl`Ȅ1NΛ P:0]`?{۶"X`[PC@M@mq^ aFlIJ([,ۋ4fws%攵>U6)29{doS+B.;EXNTxj X1%$ֻt`eNHMcu'U^x!𔮵̕8٧s=40Z?UGQG0klfUR89|! 
đw1YʤqY{ڎ2z86]EQr#%9/"5͘}S>|CΕ),,'diXz= щP,:‹DkgІ.va=wޞI@3+M4Rf4<,إ(gm#70!d=yU !}݀N$Y֑M6'1<*.c$YHB> ;Ԅ,,R.s X s1+Ws0v;*^5sqߟ:& tU:|(=hĽ2_DP ³wgki|| ok&#WvWҾ*ت8%uٶJHJIe |ĘJ WbM +:vReH+ ć5mׁ&&d8#Љ񉷜%*Ϝ>9ؔ;VK3/ XbmY?89NwmwDz k\=]='9VrfS%H|Y}q!J(c^[kIʴU*֙T)ֶ+TuǠ㙪{樵ieIqK'vdr-%<Y刖U(i*?ʥDW ]!\JC+DT Qr ҕ\r]!`#+kE(thM+DmKW'HWZHl@t %ڦd:E2B[+,L0tp MeN,c/j68H_b8=HZ0ZqO'Qà(bto0ܤX%9ELlRG!vյ;Y7-qO:8u&qQc@G~\)8+[ Kb%U'?w7h^sJZfP~P6@30_痊Z״ދ*<`aؤ> (b6DՉ>1qtS'h8!qYHY UG#ūMjj%٬O8؉L;v<̇*f1Cd72nvQ7-?*kIeDGvlԻFB\bz7Dsg47ӗ!7c!7x`%ظ_)Og=Bhyg2HGcOE8s&Y hoǾ4gğ/s6APpں5WOm4(~-&< ~.7lʋq[seEBQXq'L]F;a*1s٫{joz<5Z婭agl+]O>\rҋR>wݢ&_*oco*1w]:i,q:͹ۏ9J1&%G>]_ 7iw$3p %;hUg(N ^t3QA^iHC2&[kq' \l6k䡨o]QjöҘe[fZ4I+{ Nj)椷L8V>RN"dt+ 7wl5ߢk;**p>,+ρ#gVTů%@[xj-d N-BuFqtyo\O%75q7.CyY{ |ګ ?ЬsWb2~GjHm)~8JĔP<{E8\LC}~HTړCbz'?TϸgU8#l򊚜hp)-r>E7u>Wj糼lgqFiy^d\CqSۡ=uP2ެT"*TMO%VV6rnǥtg6̔aHuebi {c,"w _ɗh8ޚAmqfȘҘɈPCapOQnW44 (MeimY5"/٣dkn.a!ɵh8M'r _YSN(p Ruphwy 5v^͆fLf7Aj$^KsQ;芆]}q_,j[ FɈIM8 (A \Ch!Mh mDs 3B+KE(thm:]!ڳN )_ t()iJ*a% Qµ*JʛNWtut2&$CWWBV-] ]iO-+i8.f'<(ej?E2`{2T;++ 3S+2T;$B&`*AD3+aѠV1+j;H'3l6t%[ڵ䒒/S]"t,Uj]]V{ea_\@:3=jN^ Tno@|TYC*2ԳK"9qFh;SC I7 `ł WPtPʡ~n8 ;JW)Hu"Blh:JP tBVtutWe\+˃I"#C?]!J)ҕ*]!`]\vM.CKOWxKWCWrʾt9j|QV]"]i+4)Fx0tpi0 et(+Gtu:te a,Am0tpm0th9o|0(ۑgDWjæWMF?վ`K=j;#ۡGZ JѰŻj R-](ex@t"6B:d+ҖN$Q-/a]fTuW c-ʭ-'$3.+ <6\MSK!yvlM/Ef} kZlhLq3fR$'pԡDr' %6;HSAJ<`i0tpM+DyKW'HW`4 &Ohj:]!J۪S+ q I]`CBWVʦT+ T&}Zvh9k:]!J)ҕf=ߗR=iۡm~QZ ҕar]!`N`ADkOW6wuteA+l0tpsvhmG -]=6N9>]m؈c箶{kdp;a -JtkS 02B4-]"]I!*;k6oRC4,\obkc*ߘ Mt\ФImn\, F6 pd=DYي #87{\nXp `2tBTtut%(b3);3^F?g㋨+o0V(hKiI|g~}h-\?BuB'޸8QB #}_7/o8o(YI'ӛ~/B^)?0+:;z<gn)^ p;,}v+_sxN*] VI-%[UemNyWJ-{stԋݳ3Z)Z)InS4O׉ɭfJͽ<Ϥy:D$TXY]@-_uagG@>nߜ FtH s|T]OҔ$3XO9Or4M1RFeV&Tg$`*=2l{li5W1T>:sw6452L|_- @M-.i1OQ v.onr^. 
TxÍY*bףq3*%%[ƍΠgc r1f1Wx&v^N~JKyTEpx^T'ezr9GJvzxS) S2Y7o}r>~·;WLAޜ|x o/:L.@]NڵZ3x|^O^Q2AUvG[5mݍJbK9h"% Afo~݀%`ʫ7>ribTw =rWhQV^oF^%5#K_cC#xpbE 7oe֊VWiiS*\3MƬb2Tq'i88LG)ܜ幧#EUeBd4ÝaLk@IΔtDHBYYu)YqqhpZ1qT P4oxjM3f_uiv^Ε),,'diXz= щP,:‹DkgІ.tޙz{&ͬ4Ep쫷ا)MΣA/gЍ~_iHS : u*ԀrmP=^.+GGf5M\Ao|^b~e?9C¥$>QfD-zng(Z~*`Ax͇vPmJؿ?0lVļK\{A;QףּTzו50sx/G}`o[ҸnĆGwm`xPqV] pgE>fZ)!J Ap^RbT^N,I:i+Vp]"`]vt̬|r|5J KらЉ񉷜%*Ϝ>9ؔ;VK3/ Xv)ɱ45vbɻ2L, 0FExI!ZdVLK]%'>O!5v7i6@SG0Ns:0urnJiD]4\)^;&d͂aLŴa|;gRi'O ӿTLSlvcZF VXd*SX(%#!2#KfUgΝT,g"!P#Vhu/V G QA7pN*XE2MrQSsҟ-ᚐQ$Z Њ2+{-F3Acg? qE\Dgp! Qǖ;'kէgU'dU'qP%"sxPJQ[ Ff ݽTH%k*Yg5 |JxG}׫o>dΏG0}*Z螊^b]:\|衶-'1,Ve>'Y?فB˃י[ 랛 |>]+ĵ =쀻Ǭ3P˼Vt':c랻?r!SebAl%IT5dPrNB։ֳ9z]V%!A@Kxü7?yhڄp5h{da3 IL GI&Yi.RvHN8u.osn^Oy#pkE5H* ][9W8 XDžmi{Ov}qsSTr;#1KYRKh°Q=0P &&jXui gL\ yB1gI6D\9&'Pu:DD!1*ҥhcY7wm "O)r=3u-t`4B D>3=7QEL wT4 Cɩv3N=׾Шy=Ύ@ּ_ H :)j&ajgٿP$X X[ ݖ̓++{QɍH/*[ҡCJwwwmyR͝JY˚ix.)O<C% gvЭd4r,v6!N۔}ľYɔߜk2 %cg9 HgX3Y̸,4B£ ?UA[7tm~jr{0?.9for@ʤ4Q&9'~Ř*`@e-糬֊)]9X^HHήO/|2uŀ.O>=^71w/ >J=ȳa(ak8uѴ0I-+o"f5p`9KЗx=;k uHutaЈDw t<^ݘ?N^v_IeTS5a.VuV}^j~ـl~F.^ݪj2^PuTEU&WQQsWW4Oɲ˓j5=`8Ɋh'/"G׎RGokK,:j' &JXD2g1 ۅt;0TȖc,nD *E0cI$$W!kSlnZ1vuX &Em=oXs5+\^z}懯Ͽs.~uT%Z @*eRgQZIyV~x o{7ޫtj zEBxgMW{|iM-uS«u\p딗ϼLQO޷o\~j7ERps֌-=N[Uy(:_ڟgSw~_j~ŕyF|~VIoݶyz673\͟g1<ƍ[޴6oO^|M5NpQjtqbk͍yU^1l ZEVϣǗ7!rzw6.0`Ѭ{׍E^C8;kmkU<;ߨiA+[&4ll]ڲ34?>>Xt;=f--6F/[\F9ۇ)^&ꆹǗ1շ.JػHmh[Ո nVv5UJ;5 ʴS5U?=ܴ7Ǎz-_Ix-іO}47+2Ѥm%/8m,msSǍ&L-q-$V:<3TDJBh2i*16rzv|ǂd'D;DiG Dcv,#]D|<`3ۢϩ,ubYdNu5'y:Hv9b_2>zxVh3h Q\kqimT97 M."cZIjo9AR t!ɎGOlu][좺5|:U?_@oKEE #dJޡXrP/Vc'x A:ylg<JA>DH2L:@[q]m4K{X:y].Zj*jfzQGՌʕ# x^UOS'7rovl~?t+w)ÑVSF}|+(ya"}v|^e Ζj 'A6ৃR=^s~4HEc:NC=KCQ JmzgvtFU´U0_G=A,]'hϬ`颫N?)଀FYԐ8fS)B'~42!] 31P%yKr5Wcd ]}0Ik[%\d`iB)-#LskjfeigoAiLѳ䕷g"2}t`2lFͲǼ({ײ#vgeW|AF u&(l 0&USYgpvԳр!t7'B3\0(B`oUIfb@,E>Z=@*p'w>(L} gyN&FBJL}YL(U\ՙa4̓nmt44(@d6pN-Jξ$YBx9! i8zu43x&G#N+"d[MB ԹeByDusZ*uV(Ӡmݭr>M.jdLDU>ݱcwIS,ʭʎu-%PK8 4c(DLĴA=S1V)w]grvtYyXoO3EEH6 4D8Y9$ &Ol ٪;|]l\G|v:D`$pWTR 52)3Y~яs-{X{X{MF-X3W4N4ȁ /s${?cr92R:̈́0di;5m\Hin.E1+nF)! 
Os|`2MoCOp[,η3WJS6AN+O`b>)lӋI1+5}d~u&YGQGkԂx4/!< + a=6޸xJG xtN0ȬɌD xe>ginb E|4FIJYr#2JUb/ 4)%$62u.3ظ Ύqgޅl6r JJ?#ݻ*0'y<[ݽziTmr+ʈ }x?\`ؠSV1& C=#U yiީ`#-`$z50£0Byr!r%MzaȢ; PT&Ho,(9F)4)F;Az9xQ܅:C|3ݳų9Ynq[BMw?hvDtn_pG&[3l6ỉab,LMj '"49S =O73->ƁzR<ύy0R,%ۮ2HpRdE1x,gH:'e2y¸}Eo9^Zm~J9 1#: (I^u&QZߙ_=QFY[tvW áC(zf7ّn ~f^b-9PGZDңIx+x4zUZsZ(+eQC&tOףx0AP[lv٤6}1I\LaqR4V_G^B}{Z[ߤ@Pk}7\oP7'QHXCX RՓoF}FMwe;^|R4qT*QS"t< w.4']ř7Tqq\Uw>Oh8Fӳ3ƍ*D_pCjG)7!M-xvqFkV3mԧO|p6nl}}9XdbHC*6Hv5﫿OdU6[u[,܊8[оÕ 05\ڂRI@(}}dtn >20[-uxSwoFen> y2;^/lt1/ XVi]i[n5rۻea2M(/)JIڀQ& R@ V[:Y4;>?:p6#! 8@l]tCĆܙ1Yk|Z*\gq`q(.=WYr,TVXnׯ>Og,iJjgAp%߰}C+qѮ> 2W'ӂAf$d-RѳfȾg}`?N_ńY ?$bA\Ät6 m^?!rY6Y^VUJrnXuR]chTˇW~nA^j'Y&% ~ZTϚ=ê-);K7cѭW޾Hju6 ^U9~?AY#UlU GCWUA]DGDW0+k;*h;]t xLtEUAtE({c+c7Lj?O>.Тz e/{ЕЮZmN76JmW׷ZQIBh:bDѿj:WZsLޣkk3'OzcK݇0k4F~<4] u,4]J;MJ4iPÕ:/F㆗YM~Yt7!Aj[盤F6ȝN('g>mV ]J)q~z)^S˾DaUt>UEV_KJ4֧9ݣbU1`WC>w%TR5Tu54:;Omb?S]Nʓ߽,dFЪ N?:N>@9׽sRV;^'rkZdˏƓuJ_rgVK)}]uu9>CS#}D3XZˌ53rcaѼ4Jc5L ] J=,DR `ˏ \dBWVޯ tJc=8A`.g{GPjw*(ҕ]HJWtUZ~,tU>>(c|HWJcr*,p4C-޺" K+ FyDtU#?]5XJ;]]B@ uGW \ftUPagYȧ|~h]zv{tuhsaðف?+'Hhm6l=G54$kgi1yQ{(_6<\w頵v0hy̼6tx hXf*ps`Js{DvClhk:wDh7i?[u??d\y]:& ek>_EY?s-X߫Ut5_H$Qޮj3}u(%e#KR i}EJFwsQBo~8;_b[^eKgr37l396RHȤ4/%]̅roR.{Ƌo똯Tz{MRkihRf,$dcf RHDѓE *d13$ ؿ՗#)n2%D_}@u7D:ȇI%$I$L8U04*>$h*-]B)0Tg̦\ă&R I!1<=ΈF42Tg7gc*3 .T "k$Xz0$7XA|d Λ[!UK>+# .eКϱDCIpѐZ(}HBdLJLLB#pa|2"?{Ƒ] 1d{TF8@KQOk"R~`s9/: 塦-v6d luuD5bԂkW F=F]&>Hh"X # o b9PT8x0olJ7!+J$W{HV*U2P (yCs 0XQ|YΈ ʃVs I"9d^U0P>xmBV5 exyy{'*APP#H܎m }KϪd!T? y E*Ɲ(@6T H? VEߒݿ=77~]. O&!z*4}4 ]{ #BK|u)}=ŬuԆHT—Pw}CGP=n;H H j#`JY%PKcEX A%DEWHPl5CkHhgՌ6!XX-;l,j3 IȚB\"(ά$1S(ͮɀ!JP CEdU5\J~,*fa!dE(3A68FtblWNX)j^ ?h}Y[H$Q5$YI$e(m@VӥGoU4^"BZF7k QH6 W|ZR0=UK mr0X~={y貹O`t3n׷sU&VU[&n[`3 .V`&Ǧk'QEEmCYk Q^$zhhPAI; ArV{R@s80(QC^":$5\Qr#b>TJdp  J`QtQ%fdi SRC|E7o֣bb ڻ+ IISL1. 
;X`g +U@ &ezBE5FHmr3u#=A l?uQ?uڶo iںbP-Z},n$7Pt {eЫg_z{֊ּANRMWXPFXӧ?~$(wpiBiP4}4O7;^n_mmv1vMKomDC-iX%qE_\nc=Zm~?/~ ]X%穫&Ef*!d[}_Vh=b鮫`1owE,;Gbǐ%]LZ<Smz-?c{= #mo64}2E)SdU3*>"؁*En0LECp!:.!{&*hΧ+M{a*7 ]ZNW3+]z ".CW7 B,^]ʽtu>te}u K v"zPr1xtE{i3]6S=?H''VWHW?tL\7i _I(e:C 8q5è+BkOW1]#]`-0tEp?f535/ eЕ|fK<8( nOBWǡU'Pemt%>.e"ݕW*\ + ۃ#Gn7%߷벙jeN#^ ˸A뵋A.'WljJsJ%:E ƧO`M#00nڨuԑu[Quv"':8Ax3+#dX6LG CWW CWȥ 8qA(tQ,^]J阮ΐa{ s,љfІLWHW^(@t'wW t"џ#]Yˏ4wCWVtJcU^8]+5RBWV4<ǡ+̮WB aOQ8qpO|Zho!:k#J1]}jKB4G+k+BPtutefbqI$^Ys%%|QY9̱Gݘfg34r{P#7#x!@bCvZCcg99\!T~Jj=J%Gh+9B$WrgXiq&u''K;Еz?]b\+B{Cyj`;p((tEh\ #]9텑H]utE(/:GXqY<]J*HKslW3]ܠF+ b *kHSlZ@p0sWV/~PDW]/ R~S\|\%O8D.l*}]iOziq " CW7Q hթ R{hCXO82 q%ܥC#ۇ9OJ?tBwwbϓ:" {m P:R]n~CW.:9oj*I7TAgU:oS5eeNA s R/:N__w뛛ݯ~޾Y}.mz_R?z»3:{l`/beJqO~5 z0dS1&9? ~}W~S=+>='%}ޢ޿wo|ow7hW~1_۴-o_~,MDڽ~?wmm,!I~$ `ݓs8Y)RPV )RMTS$ڰ-s=5=UW]]LJĊCd<,rVo|n{i_~o>xbë"0 rP<˟߽oU]˸iv(4NѢ=3<|3 ?b(-;cz:Yn_^S묮}iu 8os̬?WS 0C"QSiu&TdA-a{MjN`&dͬWp歿#Є:V!^eu-NHF?T05)c$pqwzȹ ݁2b, bQ}fqRҎ?Z59X/w6A>[_BbWC?o?Fz4=`09seIG\Ϟra$ϙ܃n ЊY6Q) 0B>J #* x 4&c #9hqtPo5B@!B`!D)gK08c:dTa)&C yUe, V1 )s\t"x 3y`Be2yfv扌gzì *sxY?m_NwQwP+3Z#E4 >/&jz"KROI 6<@@2 EvO {N ~z4g=ɻmMg<WGa6j[ß?Ya}eh9<  F_1Bj0۷G/N&q8OnNfB}}r/.zPFLMϬ]M:w4O7gسz >Y5 ͓'&l)OlALįT1z&oQiJǗ ̅a}ڼ< p*.? yTJQ|eqzW_/ǧKrmZH2eeoaFyaF+̸),˔q1MT*h))Xo&FHAM&q0eu\J#b IJ6 HJ 5HwsN&< 8AzjCbwҕ1 \:zbu S~'bVB#x?.u]M~1X A@ݿ~ n埀0`'u5#|&)?Vbm`sE#|`S؜]"-Dr5lzR:1kwj |̧/i ʃkO5T`tD 8Wa$3*@4MKPR/285. A[?Qf"<1ຘ8[.qr<~LJw\ԘǠir2m~ K432ҦBOH_~[dx`se^;3\sy M##d]ז:P&0N$3)pҒƂ4XcHPUN5ф{E"qC;&hȧf,Ak$'gJCsT;*s蟳/+&f0t=r9]^+ͦ:}-b}7WKWkjxyU΂C_!6Q"{,3/L3+.:qcN ,E7EG [ȣ)xFy8y^A$(GAB5;ː6yH2ExT=Oϓ:AO KGq>xVcTSi['$e&6$kAQpNdL*(db.B6-(2$ԃ@QifwĆy d!T‰g/TW[w/r9 b ?P7M)%?HM9Y߯ɇQ?:?d4>A%>JO2a>RӶei燶8:z̴l:.Զ 1`y۫a&Q v{ 4 fr"M.r5MV) 0 ENҁ!pW<:M w4y"?q?IoPOrSz\IꪎI'Kn7\?l\BD)DYhf&k)%k"1-69Ъ}l }kO{\klca0#Fkd_zU3ɦV<2}u0ċɿRN;?qMu9>-.  
,FU4MwJͼϟaZq`W; gS JpӺ-蛩K}S݄"um+=˔,BG3qHι֛{fzCmS;o[w ,+N7uf oc6VMLtinhJۻ6]@`A%b=V|2\*ǚ©mݶ Ut'~NjՋނlM-2$pgW%##\mqCd2Vkѳ#NCG /tWWńkeBpW3%LOpGGpHK@"rĂhBD H@83iRlRp_d}xt`G .OWcgvUm㋧45C>] G$A͓ N(+pZ:l 0OǭeCotOӐ96M.M;g.%o0k  }4!FJ?c7 }Ƨ8|ȯSBD*ιXUYND!<ܰ}%윳Ho풫P(%`-ʳEa^~szoK3umHQy&.Oh^o8ȍ^]w/ѷኽYka /p!YNL2lM Pd{DZτ$<ҺVq;w|/ -dQAI^VZZyO谉8ƒN/uGYf.bqST@C䬑&?7M+7-aTqΰl|jrޝVɗ' A KFHP9Ԣ. Gi${nR*Wt*&oXQ ʄd ResWD+=!K3֌I8Q%Og/Yǣ$H!b oC\mh^qd,4{C/[WٗR-Kk{C)ӱۛƕk|OU4'd $IO!GH4#PZCKo+%fC)dqq^>6d]bzQN7*Wҝ6:9iAw.7POU{E/8HQEK-I!Im)/љ(6)gUi ZcIYzFet !9G5$`B┡x#uHֱ1TCIabf Ua_pΎ\edbE{O]m}Snw * Fz=6ɪ GRy2R h%ȯh.PubQhLI$p2كwsYv\\1Y (Ąªc"^]ko+ }m(: ܻF'ȇgYb[ҌFc Q7]lOŪĹhWL̮vkc_ԆQzc0.)%Ӝ{k 5YrNxdA.Mck,OȠdmd;L̐+L@X F G(|:@Fu׶ Kk܎Q??X+eD{Dܺ%iTٱHTHBpE>mPޡC/%ٚ1 m d݌mKq:'uqŸI)xty6߾m~zujJB$Foo84,ԌΈ FI)e:: NjY9'0N> 9 ]>[tbzX W$ (\9*,@)B2R  (o0Mϸ̓euKYS+N+-!ctpY -Z 9u#egZ8Ԇ *$yXSN>4kdB!!)nmkfJdb.<Ι8s(|]Ega10ɔ !OqHٙQɘϒIBNVMq¢z--%ei+̒4ggqF 08?M~_^{OxO #X(NqaK-qnhٺr[34h3[ n(#>}$6F/$.OV:̽. V!n0 vԝ)l4:. L-:z֬%]lc}ma klMq 6؆kīw5c ʰ}hXq|p3mk_dx)i'>^ K\Mh_cS>] 1udF{/P.Uf&$41P֤(daLO@e+RtY#SʽD_%-H!u.Xrh4bA|@:!HDSWeſbŠ~9i+6%o_OIq ت>ma (5@ryQLgdyi}e%sGOy+pM̀Ldr!3_jV 2|k2L*"0/)+w!D#i7YeEĐ$`,Z^ hMUmlֆc#8\]S{\ԙW|5Z⻛'<)aၓH?ŏh:dm IOLHf(4klC{٢T䬤06(0ɐZp(dVrݮ\-j6 =Sz sq#5uF8foz6 ^u{ϖBnm< RXBW]j}g$ pJTi)]OKo>p|K2_JTr&uj}k5/ ^,5>Jn_^y1S~5U~K|YzbhΛ?/^Rzc?Irm'͕ZIcQؒ`0XLV1pS7r?Q>;dxXt1%Mp7tJsdBˬ2$mK lCG ))DQ&{CֽL', +g@rEB^Zg\]r8VNh~d|\qG$;/oLw?StsIZcH1 JrK6##!RbY 55ͭ\ ^Ϟ6EϒW">:@F` fYc^,ֳQϺ[wy@۝9N9 ƊPR*Y h tDrGfѸ{hS>ti= w駎FVm% g(F<21Y(ETWOZZarIgI0Ԡ\H² sVt4zk A8ZNimL|!!}6SUӊH*iyIH.H/u.L(o-f%9LYI$˜zmkoEd+CwEZ"*nYfv S,{ʎu-%Зpd9*P,i-g*$ўz,Xݥ$ƭ}ôk\5nbTn,þ0y ('s䉜M gKI'}qB~IatKEYBGGSR+)J4NG#0gB=F ~&(~J矾Z>FG~RiBzHfJ-OPѳf^g}773CBa ?&B<\zt& ]w(\v:gM?|3OTC` :^M/5OhE5xv&q߭yp<czAdh<]/᪒WĥFcfօՌݣ`GҌ0qsmy~tJYnF(["ZIk ZQ8PNs_ltjt|:>&Qp8f=h`Kjh]߁6,ĽshM;Odkom4 A&Z9$Gk5#KL[5^_OSI,6OU1,E4&T15ܳZ>rsUcSOpIXd) i?5[n}YaИtc^revIEx!ݸP+Vⴣn926FNSio>Zrx0JE19jM/SYJ場wm JP;^'r%<>b©6eW6xK`O:Dd8R$*&霹VYFTqj:gݙgmq_%@F^D4$:rߓ ;w:t: :ǭ- X˱r@*IUy8TJ iיf'l䈁 
D*tqKܲ3Kk2.~c|ZҞ+n+XMi/pgFeL*Um$_S9k8iQu9Hie:C{O WE`kH\PH+tH)mW_!\AUUաI+Y]\pWE`WE\HZYH)zck+ RC+X:\q>"i]"V}9pvzFa&0'O Ww24pu7)yؕ\R.W3W٤6oP Nh :m`bXc&2~7h2-kQ̞5oX 1y@0]x00]Ux(0]50]`^ڔYˬjя>-|ryt9CMX9*G~Fˬ(S*xKs٧hECc\T'3qj02Twsk=?3hBUTo?}b8GDH#v3Dw iCV@;g\{ʑ3W#@vi /E( @|d/qF<gs^bNjɰLU46: ST4B+ T4GXX6֌+f] nPJhyHPW38 a0at%ю+ۦĹJ(Ͳu8 xNe?@kuJ4J]!V^^2aC[}Yiq92mcXҠvkKZaR/{^eI=̲Έ+G>oUraJNp#RɁi3JN(]*#ak@`ѕrEW%sוPn8p 8I+uv] m~rxtu +v] nQt%+PQ[HWLJpEWBu%]ʌt8'MWaA{2+K1xbt:R1.4Z}1(~1*)=҉ ~te_\GѕКJ(-/ftvp1~=i?jn<&Fu ]M'&*,zlk @p8in] m2Ңcԕ7fD]V0"z߄֓ ;^tnדZ.zRq7P0yz=iZOj֓Zg#F0\Gѕ?w] eTPWX93FW(Z@i/:B]9Z +>]at%Q]Wkʸu vbPY+Wu%v9D =7jh] }v%A/:B]ʎtF?)Pp2Jh(JǨ9t%a(>*e+K1wz>Ua7ЗL=i@F9Ņyxc^{VDљ g.:Vx;Xn MX7Q4-nJh(5"Y=zR6Y;^qrri@IMeq\`qVaKK'|ѰgS#xqFpZ^%9Š ~ ] W<{ۈEWǣ+bgp׸iqzFa rn_t,rl]IW+_<6HW(پ8AWqc^{I?GPH)uf KZ &lw`r5θruyyZo@_C|?w7Wקzϫw.XҟozBҚo_8_uCm.>|F_$߻Wȇc) ڼMc}9E}7'h7wϝ|N[9dC^xq'?Z}&[n }8j _nh-{ ֵׯ닓0(+yg zQ/>^=zݩSP=|U]8Mo;>5o;=V?l_ھH{|nn |wN5i,׭qu(eX T4E=L`(Eh܋&Pne)hB( +?p}QtuJvj0+=LpR'JE@⢫cԕӊHŠ;3w#hB˳Ϯ2.GЎQW^[ 0S n N^WBih1ʪNO֕U΍]p8+ˮGz=YLVT#J5'O3;M)^x_^|yAx.ԿƦpP/$ʏ[}zuw7_|'0RދvkN=~h0>3޿c#^ZeO3j#b\@$P~a~={?Yy6ρۜ)ďgKl#_#{[~.s~9_m[Wg:U+`U2Οؒ!\qoڳjYrz]랓E'Kx97=7wHϑE|׭J7~.Wk\JRQw_#5919kj`)tRΒF"HWy]~9DS&ͪshB*U8BʹR/zPէ?CitZ}:PJuU*MԱ` .U!kGls"4JK2L~"԰:VK6ׂΈiͨ1tm{թm.Je,祿{j[,\XSZ@t׀Ȥ\ ]ӔR=FIEvwn׈f66{HS^rJʇ<"ZwГW5eu hQJCʒ&S$s*Fa ϜKCGcT9jz=$ ]Q~u R:qxUGJ2]oysJ&+6Yli/^:fDy%cNsoPJ(R>ͭZFUAɛJj=8'*P SIIUUQd.ju}gt}D:Z7EߢֈSEW'qL &6#&tH/*m(- V\`5L)QR@~IX(ڌAk@PI{m:/[eR5$mHFu )^mh1s#Ou+Bkjb0;*D{j%xavYwiJeƳhBAj6Xv#1($xitvYTltCnr-SFa}Cm{PT65 (%ʭ:dyU糯Ҕ2d-M.8քF_6X uʺWHƅ%=2=sͅ/ E@Pi*0)7f=fZ{JȢMA.B혽kj*&@Pɔ  )Fh'x Jw* JiRT@F&2̆eBW{+I}BVFE2tfd,+" g3 VXm $fŊLiWNc YW܌T .n%W e ºuu 0H$ DrHc';uOQEg̥5g <[a1ffQ$Q 0RLp1U+Wlu( r3Xdc Xi ACM T4gvJ!]`cRU B!^[T#J6:-A!N) ƺuu3oUQkօ@}JIִ3ˍsT4;zeGVcjek8('lBZYYpcH0 hTwn-A'23r-/tw({:ELN9T *[PER; 'Q /~  1\+NW[sޜ5C-xtk`X#a-$t>%$CuP\D Yedb)t%9YT*#"1,yLEǜú :#.'T>$ Li0Q!Xl5tq;,~= {N0A)IHYY'k٭-x R܎#A,LTC֏YߨiqT#mC5YK!eD!v*"};%ZJn!'1O 枊 lBvwz`2 @فa^Zր' 
8,h9@l;h55(!fMC|0CN(׃=VCԷYallV|=[*ϕ+v.:`gV0N;B`X RX8 T CWu`-q ?xڰ(RVЏ'#sJXZ#eˍLpRCY_85'LA p&}*H# Քqe oY#9]r5W%Pn5ʾwЪp k 6P z  wj`93#@Jf-th}/j@2bi%|VBO連 Pb.zR [% .D̯kst̆bUvyc٥`vJ&)EDSXrד4\N )w y]00t #;f|UGy&B].nT67@pPLJ%6C'5<.4ܥBpz/+C/I0o2L8s f&<3>>YD«E^f.,7N |⸷C]oVUK3e4LoO}%zc12J:N/'3.)?i>2is&Xt8Y-\ ꑙ{|6u=nO-Mm]G^x_ $*~hW^q[*pytI ܺ(Qگ}@03Y&`H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@:^%qBXC%q(.7QZ^ ((@|CJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%*<(vI U;J \@֋Wlh$)P dWҒ@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H tJ :FtG pj{J Di)Q $"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ RzV2TSu^_V N,ラף" d$\ p ʃTK \띰웾l[BZv]+Dk͡tCDW BJw9xBd ])WK97]mXp`p3ٽՆh ]mr%d=Rt5)]Q;DW؋LuPNWrDWGDWFi]<M͇r 0fVvB]Y N}-L?_^w5.КC=ἑZk!v]z[?\ZAOr6kQ3+Q/y{M<ߨbyގƓ MΖ=/]3JQL.uzy]~yKƗBMnG iy֛\^o󭫅 7#ˋZUmTb: bW"eV֦͍s>[SЫE4p[x<`) x EjJCGG}k}e26*xk̈́P>} iL"9llg"9kuW"9DGrk0Ka;DWU3th8tBR]!])ťwZ9eg Zޙm=@9tBJcr C+}g+U;t(KЕQ.ϴ]!\ }t(-]#]ʹ+vpMg+D>i QRQҕһ.yWN+֡+ uGth7Vz { _{z[/ Z6J"݁ezDՇV\2jEMzFW٩󶙍:P'w-8{ۥHK}g"9+yW"9Dl3NFrkd [B- Q Ktut%^tЮ3tp ]!Zg':FҌ[k:DWxWWwƻBSDWHW}-v3 &%.%|>OOn|xv9ΓOmWsq~&.IxfrRs7)"CU6hv %o1c+o Azo??VPax'w-*s'%+2ըLFV ?^ѬǗF|glݙEݺ?O$kg'fâ~@k?;1L|umc nomb|SwOGm+9d-߮n;ۚsm77/3.\݃ *o=tdc󽈺,!/J|}ׯu'׃?jjm6qZ&`^g仒 樟? ϖWL|qnge?NfO#(0 u_= z8ΛV×WXp*l?]_}}bo0.웟۲x[aeIjЌggVqcY%0>M;>?ތՑu7rYMKe>UVuOILqzBNNŝ'ӇKBp%p'gᄔM3Z6\ 䛡1~a\_C\Ns jx{s`h/5j3/ ٷ!E>g%>|x}zց\̉r{/Fp ~oڤ0b{^[EA[DbINoOOeyGXʢ~%tt>^NG Jb, LMpZ .v5{~ԃ~eWMPymn`5wP}f95B/xtsU0ɶָsdэ%4z#mMZAԇI{p? 
}mɧ5˓ޘ(,vcbxG g_Wb2x^\˼&ˁj{VDY+BZRWt˄޻܇%N.8\.dm[lWi'b|;"O5>c[><.]>t-JBW׬bH9S0-,c5i|y{(ֱT0;sXS_8d]6u$0d@&I61Lڕ#wli3['<ʜ'bm.Dazz\)tGˋzQ;·CX*|9Tf(hJz|NSey|W<-yd1物m$9?% kgZU[䲩ۺdKdrͫE[&e-_ORL4( -C1y3"&"#>@:Cd %@]ZC=MgI4,,tGy >h $z=;s3xߣoL>L'=+D{km@78-zjf-T(K?Y|^~|Ėjv&mF[>S R POIL<*=KSYឭWg+ wW+*d*{p9!|)`z`[z iJh &z.}R+eP 4rA>"`D^V+9^?Ĵդt~L)3e4/JI߁BNMY@XL%gpa*Ȉ%NwulmyN0K"QSi8Q)Y,X0v͉-KFQAaRkjcƎYc\F@BTqAC&K JQJMY-1uceg&2Kl`Bꤊy xtGtᑜ:˲3z֚( ` 1 XI&.i4% ;H1sA'\bY w@1I3eq01bQ,/EuDEg9`8:n|ps^ 3G?ipTp*,x,l CD:iv.I!<_ fU8n\l*ϳ32ݶP{œ`Jⅹ0[]>(=4R!a4Xˈ!#QAix.%ne_q0p>(%i!S{4p2P AKL cm"P\3gW3g}u1~n&g?n3]4q<^¸0>AsҩV<7ƦJ|M`x`sY`^;3\osNa&>'pA:L`HL,8@5`YgI&poC@H\FZs&99S2;G2֗3g]\pC;=xʏF$ ˧Yc~Ѫ\,8r cd!yd4Z\!93g%T8l!S&Lg03Dy (t<{k S{Σ FZDrL2G$SG`/(2ewҙ6# A<;ph}wԞ?0[kYm%3eì҈Ln xea]8ŞӺ"ʞ|}،[LtW0Vu?tvrJ'|Jw,[aPa(hK;?dK(oV(?訋n>oflJjۄ涗y Q)+đ o$c2tEVƆ0щlѐt ܩ;N</"t'_h^E2j~"75!ML|/.8bT=^\q{_Be]w`b2Z|%kCL_c-47ip7#N'Jy3YQoL2+1`-f_Q}L鲵DD:,Nj? K;qۛ!߈Hdg|%94,>:O_)&[=nO8z|")ᐭGݕr-}1 [V>!+4ٿOgJGTA|iӺ-U_7g/Mu[}`o!فMU:scImwp1_u`sn5񴍆We_I"cBVS#7,8B*\!%msCHѼ>2vy~B fL" {Ӗj1Q1NOu U5 AH#ЧwQm\llQjF낸c^ }+Qj$x' iUaGջ8`%DSBC"ލU ӃM3!DYmx>zaGa}>vt~>{bdNz$JHP3 NbfJ ObJQRb0#QWVP @p1:& ;%5iоgN=v}VdKy'3EySR ZT"%kQA9ְ B:Ey˲w*}hd?=D 93W%#Y[}q1K @kpiP*p끍^lQQ/kDUꞁ3 jCQ}Kw7!w}Ukr};m a5[uW)kB«_a&??v+ď \_~X^_]z?k;R7Cs2E8Z,A_~k,S$cN۔]ކ4%u cD.&:M5]۶5Q<׶ân*8!6a"hy__ڏoeGO#6[/.xPv'cr6{G9kVy^r{k%Gzz9Y5^a\:0WSB)+^jxy1yJ& ּCC ZVFV!EAD^rǪo<]it&A<嶼%4Oķe^nwV?[]|ren%qRL)0Ǥ7unA Up %Hć󑏠~5̮q6ڡH&de\7S[HXc*!\4rJHadԜ!7se;e&^|ʾ;+>)y˔QeF|ds,tLYMط3Vr `~ p}hl? 
?bl'r9VQր ::Z{('?f_?(&L0V(Gqf+E Sp4GZ ф\~Rŭ>fIECX,ϤbRϨclzd$-#|CB'*4Иfʇ# rlDI<bkښG,m $3d0ڝ\*>+ݺϓ$b@3:}{׌qݲ3NnY6 Λ-/h;r5%93Lv ~_.;oKZ`l!M{HM/괧,׻[zE{2ȵ{puHMwR;ι [F=OK]S[MkwU6^5S?1td8B;!owiu7gi\b׃řWZntGw+߬51kٝ|lj;ܭ}Xu桹E'ERX?>b 7tV4T Xf".vPBo}p}}QZLnʓD+ 'xpjc)іPA 1` B) tdП'F$tn$ݳf C]M=}mzMydF F"#8~9p, 9xIܤ*TLް䌳"P od9Ɗ[c!H]2j͘Xt-q3㜾 ؓ!sܖ9n)T?2G,3MhySzldꊩp $VXe0 :P9"'씰KG$D̏" Q$} Qz=*4qmu4*NDMps, Q䌞}Lrwd.ǖ?Mؗb} ђ P儋; 6LQ eã BHL@4e(:f\?{ǭJ/ݴMx) 0v  ^3ȗ,o{ft.#٭ nVU*@D"0 CҵȹZPӫYw04Ѱ5 CQX= ^@C[TF|4Xypeh Fisyc`.) -Ȩ/Q I*9KgR 'JL Us52Uaa5 ue,)\x5Cakyǣ GlQb @*qLdf+$ ER\ Y0cE+ l62 m&iu)`2bW#jI2ř!8aB?>B.%. 6HС [3&eX^*jo5rF=:-Fq'QhiIOOfa[2륵%)$ʭ)'[|Zt֯h e*: KY9'0ɐJ> 91Z⚌.2GK1u~B'Wwa^dY[zt &a rC NfRsC BҠ}xu 114I}1UA~ oa%0* B G2{N< 7HBg%*M f ?NkiVdBQBH4h3WbRdb.<tvEt9>TWvq,Lɗ8#=`|v:IIjƨRdg`@ȉJ.7;R߂$yѲVh{:O47I i8ų~g|q=G/&(]Z>BcFq?e4Bu 0;:] ܈æ]g' N?󵛥x4m p8l~;28ŝRsߊuߋ?\zzy)76oe[?h=]]9[dk-7MI8\՛\&<Oe/SEwӗ|~fO/>%0m$vv>zE2Aɕ~*\jjO/ liOޱ!@iplt7LuGsƩc4YsN~&g=G|A }΋|%INu>8X?Z 3΁E{2FO kYGgB^yMbmqA 6ػ>ua[WO/?RMl"'h`/#O`5eFzi.Da(-o?6IpOkL[|ktW bI6sv~@lEn ,iZ04 BAx tf ^Fq.G'b<`ݽcedn=?V0%`>#sa "cdZaG,,@)*;K89ޛpZBr|/}\EtK4j!+\-KBNκE9ȱE<:#} ̡ ]e7POCBIBMmo/GtA^7+<[ i6j)=[;?uQVAd6%M*\p'.x_T`FVy7VZ߄˳UUxO='IM]ɖ땢E!c4D[${-*]|a~Iwq2胑X9>k ^LCDSL-0#92e,0C`,JEMv.[˪VUpN*w}}e,)P"rH t]>3E*8&rf=Y %9Р݇}Յ_WAZoZlv {! &e]Ktt(筣/`+ ('խ@V:[hYd3/T[Kl6\f@nfL9/UڲcJHdy!HX nْ,k:RR7^\g*C0Ȣ+9WSQ (.vNy:%e4ޯ\Y+jiANټ>]UO7`sH߾nr~(ZQfb7SeՍւ3\݌\` W7VQ, nW0ծSO<*rPP+UW_#\i T>cu҇yĒڃ3|/$#v5a$L(8`gN~̦i$k?-Q-]vL t?۠E\b4u$釵# h1G膿q4/$mVM('XN![ӧtgSV7ZnAxjXJ*&LPy%y̼ѨV$ZG[f9-RΗ#'nT KRҰBJ VCL90 ?IywK`2 @F9t\ JDeB0lFk!Զ^Mg0RI9 #U9-N 3rCT# &{detBb-#`sdAS1b: H 9Ϡ2 &T#2 Oah]FhG+ʀHDGqdIe>A u$ ,w!,{󪇘{˜[)&kY,1'WBfA G"U1rxa8iUG{FL:F[/$`f(qRghˢRw ꎹ8?&,gW!x@jd+S eZt!0n܌j݌j_=~*oӋ V,p$g.;,7Lxȑc6H% 8ń |3OݞC/ hXDEGoofş׭S2+uknP>TLnA*ݤ=?;H! 
&˙Fm [m?A@ hP6BeS흲 5y~xݜLe~) ˕za6VZl.&*i +ׁTM"9HKRmqEHlC}΀N[BZK *˔C4d N(t`\fr]knq̼ lc(~J>#Y]5菡]qfTj]=Y-kl~{olPN~+-pR1&L> gE0(68}$S:n&!)`N`!˽K2̜B;,SJbFZ 0 eItz,(XI%,VNynL$ 72ς%ZOhaτ4/O)9V NWӳB}wm &،!zp8A Bc9e%^"eLOOMwUu̴t$ Txyb_@5ceT>L <(t`~tuGY (`|]s"O}4ECH&>l>"dIUL7p%E|qP7KRf0DUb`نRdڬSħToO,'٢U.oEţ !aJ.~ܧ<.ͬo"I-%}eS-+"UmhzP3uf2 7ɗW3:kr5g2×, o|׉2J$}S8MgR3-矛`c~ WK'cX7jYG_%wx)}46@xFnl\[8Φo.mrדTɧG ҹ0s1w E5\hwh6(vx_0hPkغnxdܠޠ9ԽaV{slGg'ui(*6QEE}sE l_Xvqoiץmx1:/͡^z䚕3>߾T[e@u`4xB)rh+ƞ[S.&2yqfAӋjim@wv,29K@a#XAA!6"Z/2JN鵢6/d::T3Ɓfx˩frƺEu> l|v3>=gw ˋՓNUsG1לeup٨x0!%T@IZJmZOQNvZˀWY]ic^[u0Fp[ϸ;vaaWmcכin6ߛp3ߤRuB+#΅q9u8w˶:mqW|cO{PvQl|쵲2ara^!XGX5u[mzM9`jP#ʘ]|[gvDFzc 'pk^~(̵A5`-'Rl9qJ+yn#16ڝ3zOm u@KLr* S+SYg=!)F/5!ap) yhat6mn7`[e(}&ކiY>?4Eg>[Y%`TJF'`c=s`#a,062K1|lfenN6,#2] `t:03,*͸Jb,A"A$A}d(`ܷad&YS >ɝpS~k1L# V* ȉQ\@C&P'av:niA°jHqpaS/8b$cJd15FDW¡wԮ !q.J`iG &6Ke!X" SYL[֧$sA"ṲL/_ Svz0~zկo__a^W% .(+:Y$ !*9.+=0ɛ{;易ĮAJ},e)~SU亾r.X:3ˊOWMŧ|ٯ+t8h(k1./nݓ~Q)ФTgupZyPZ/+yn..OnFKor,-S3Wwv ?;狲Q]BdF~e{]śCY(ab@AѰJ^ނ d3-˹pe엟_9М\h`ŝ>j!25oWñnMv57ft6WJI5|>3^o,λ|3UNE-<׿X4p]i:_\̞->3%e䃕@FuJw3,}UsSE Nc|G.}[Ϛy m0ۈXkڪ`^Qz&4b'Ė(շ|ۛKXEy U-~%ˤ咯`1]^N_RU-WBj ${Hcɥ(Ş4X;Rq!:b ""qgGJcQXр?qH uVXmT ,3vSHp$#7*!L軝VvΪIkC, +6^`|`1>Pt;NxMlFnWan4X}N#EՖ*n?6ЀI zw$q0OdAYwdAy*'[X`=Ar#;0>}B]uU) p K VѸ䓲*"S2#EDM7{,]-us1>ϟep: O^g/|+MձWV7㠍VG 5J\ y:;C5cUʎzypznlCJ]f%>O2!9YЎѠurIQh- D1R)/LG eZ30{-#?* ihFΞ;u¹ceosk\m(f`F\+Beon/FцɄ)JfZc׬Ɇ_~?O+~i&}NGolȉgTsV (Hnu!ZRz3D烨Xh"\fԙu,Op,n로Miy^XYO׵T&Y3@q)&z /7d*J"^Grfzh%d>sKdm!'{tIU2l=$ww ,my\nޠWr'utVâѢ H,ߣ4:VtސwzA6i}oo"V A|dh{שw wЃ.&'f&Ρuu^= ͎d=}4e-[vyxw{dZnp7 k9rWJA[f|~t` un&._n|aunZtЋnDTk~ƍq`9ֺkrpbԂFnܡsFϕ8wIՈ[x$o!%'S9-Ҥ8ƩGJ+Fs<:P{2 ϚFWڍ+I`}~ 4_{j"~v94Ϛ]7Bfe^e ۷l{sd0FJqO=/EkEf+QpV>ש1VYE ׃;6`#@R“?*:d>9%x qqSޓ9nC1E%7YYr,89r1X9hP !`L8 c6s1\gj{8_Kuw`].n/.{_.Re+E-)q߯zH)jY38f駪1'dHHD@Y6ٕ•\QĪQP]ʚRkcHa*]^9(Fi$RGVi(2-va0⧱4Yb_y1Qg8rJ]8I0YDkGPX:;j7swTz=7- 8ٳ1p Rd|zi˳wdBTKRJBq-rbGUYrnUI&li=v trF̐UMZ;H, Cn\vb~G[uۤKejr_0ٻؚ8 X$ #\IBQNWJAkX. 
A6wbkP{eĹxj06D0H=&2&^܏yvZoFǾ:vP{b[SZֱI$LVh.Ԙ;١(R&·T2J(0C EKA*HXGjR*ʮ"A5{M9ԷnoJ F3"℈"nbYuWNI-,e{51b-. e\WD$6&VGU53@nڃmR,һs7sGwoH^u\\T( +:bpqŭ߷x$Gɑ ARj <ƀ0ArCnc_<=za7?e/jݶ"Cb^(ZNg`ZrTJ2!mBIN'LOg5!TLSIN (TZXM>cWGqrT%~9 b&l ј`!;Ȑ|Z.١I0.B _Y,߰{CqoqBmR$VeT!jsPaʹ*Aa(2n|.Xvn;ŗ "լ(CƒSUA)H\x_قbIeRu~&2n|TDeL4H!149%(NJrng_ZH}@c :XTJGSZ-Lq¨SPC@R=]?>`z~_C49)hRB1(jj%,P !VAHpMT*'S4IQAl)SB&bkBh 'ǹm^HadbX?GXQt{[UEIg;eDI5!buh#V똒R &qLh+M/#"]IsZnCd_sϭo{1ڊZ˓H$bE\!ƨR΄=;FȾy$hS8tGNS%G"P S A=v=NO(,25;Z.?l `Ubp0DEM"ǽ}]'-ֱ{/~ZQ1p#d֥=)YD[kQER:iߊ0V[ F_:T2)cH%7M, wBߖ*C+q28#g:j3ѻ7t:}uttG~8ozw??ޝnXՋ843;]2Hn?ȁ7ka껓WDj@ݠ]uitdM;Mz"J]DhJc1;5Zhʷt 8?8c'g <@Qq.GXm19$pxV"A`u&GSX4v"D1b\'@q}Mwsҭ<ݩ/?G fíV&aP=Pbl t_/_捻go;s .䁀9e@Y=$,Vk!dCU>!`NYmڻ'55a֤sծ<"E:jBWsQ$QT޽zs IiEglٍ.$6%MܨS>w3_/O⤷ԣр۾{,M3'&x+F=Y#2 ݾbn픯_|#9-`y 3&`p1{#x_ kk:{0+Cb++_921/5?g[\12V$N>xALr.]z8n>w|Ww뤗NVɼ7b 7ˏo=&:"XE-3b _x :HFԳRpy~+Ͻ_ zFp U37sf-Cf͔W5\^n j,It!Rh7__oxaˮW޾w?ZvTLCTP (kK6/o覽T_ɕ_<~j㽁kc寳ѿ|?ǿZ9n:=/!^e,m7l;#LIZY%6jRmRjyb\U(!xev^#%2/[X҉ 4Խ%.A~0vS6b32)K%ֽҽ}Z; A'VY0F'nmQ;Il8N>2t6uؘ*v.Ƞ={W*:눈Mb`1(˦`J/sO~B_^[zBFS7E<7Ϯʹ[vޅ[wOSi)|d5m|(مjN4編~Ott0tM:ퟮӾvU[%$[8`c I׬RsVE]{ò޹?wD0+_o;8b:,"X&Yiw1/ o$/QgҏLd.ٛ!GGdnc-;d6G ɳpz,55ۿ (dL-Bj>\߸+/ DZoy~ o2 Xu B4,[Ԏd6xmhĿG#+rpj]7uKY~%FX۷TY%5O*wq~gϗU9xMuj_@cr PN?b('o Чk|^%CK_C֎'!rmVhl0lt(vz,n[헗yWFm/(grZnln{]/Z 7>ٖ뭏n?]N:շsu{.gp㮷%_k/+ ry¾'Q0S2m1Z3m®>_#8됉X4z[܀1)H87ehQlym)%T٪0bƢUc:Rbj]38r!a k&@[,0՘bkgm7 %*_&5vu3ؑs*|/`)1P_͏/WF||rnl'tƗ:7snls9Q\:'5#j_2!&STE赌 \v1)jcLZ~^+ SL:3#v$1%ũmGsɳ?n;ojyu9n)b,&8cp2j!0<4HYT&&]biy̡v*PQQ6D8\)\eKeХl,el 1VҎO/ؾ^9-$8 NC1ϐh#(L3qȠ?>k/c&NUJBI"X8Q\X-V4Rqh묧wB{AJO/myVB>&Dź$ةd+DrHuCPF#Fʬ"rneI&l%M 35A3zmAb͜W'kd싅|=5aXxN1 ˃;6-/jQbGj(X 5TF8 F)WvU )k̹Ǡ~vIqkxz|(m|싈qB drRLBlAWX)qjc2Z ]ʸ헛H 19: 75uH$m Z-~9#7$: ..{b*k셋ٻ6n&,u~s@˛kݡiRg7|pe[+VUڥ$gpp戱"vqF|rt_za J`s0<`I[Qs-9 T}.=mt>y`z<8YlQv#lǕj5JX%=Z&5:u R{gT \"8E@yg̤>8J9 Joɻ:Py ژji!@0‹Ls&BNX6Fs4όRrE,50(֚1 %)yb߰7qd)6˜IF)Z~(M{{۫BHxr&#YjFgDFI)82N`Dy H*$BeN-qC. 
:GS1U1B«2mN\$ &fd rPUfZ(2rnI##EHF>9Aw[}lT&ix|Xj[Y`V"jAky=nd[Caʰa?a]S^d5…BH5S1bZF٤ѕ9gs"Hwq,&r!d˓/<#n^?x]矗?ߌ۴nTZNo?Ng^t\4{vM:Mb7פi5VhUrU9wF^>\`vgמю!}:ug/{;;!5fh2 }ߗ&%%ZQcmn;/@$Ň|tm]d54?׸d,۽)MEbcZdad˒mč&1}xƶq^X35SuZ&Ƃu>ӢuvfvveWl5lNeX1fz= oc3M%z\+Ӓ4|s}ysvy:k"چJ4#21J "JBd&$Ȓ\s +wyu=q5$$7^aPXAe 4򬝕` !㰳<-nט#^|^ ,+/]D[`@zg=k}W%k}V¡g/Rj>d 7x;>QD Vx  Db_h؏16>P}؏#>X嬰3+җ ^R! LHP29;d8,0K ohEiJ Ƙ:hPۿ^f&$津)Cvн h,\#| \F˺{pRYXRDf"1^'NFMRFcF}V>:v@:6KcWe-hki4^<-@mGWB={\jOנR9(Ƽ+sR%pʵ k=Y[Cx`Hﹷ{;Jw*~ h:4[;W{zȕ?]q_i[>>\.euqJ"*c(Ryt:gU2nD+^6Pm\Hpi$֞۫i.v|+SUTQ4M+{bjO򯩑G:wuo~LO:EmS^0[J? 3;^Xie*s/a:>,vYחUg"Cz3.Jƻ27n~z4C薣ޣbi7S>{뫭4$CZT=PV?{imbĶ&^s@6˒M1 37Iڈ0l6x!r;Ldv#pƁu! CS:8 jR42&/qn$vJ1,lHs½q{7&-a+b{Roh1Ѩw{,/%1^}$j%7URX4?ATh kI*&WCS{m9hkA*=81%DOpDd$-z.dV&mK6*Ι` Pg#o{$)&'/W d (cP!=1 GyA~sߏ4ylT:@GT!ݤXf6}d!YȡXh8-%\+ҟS.pW:MRxtN0dӴT623( ,劣PhJgcrYFR>pJ$$ Df2Þgw>M?|l&/J?#= i<-Vv{ތJʧ-PffzZa3?|o>1alD'<# (cǼ\H CZhGjbX><`,r Ē7f0kd: PrDy}B4^,Lo1nU&H5fyD1Z+&h'Hwu :4 ̥1Ȳ0)6A +PU>y_p U7'WJcxiLO,ϗfy0R,{@.I10$U""౜ tNd"S 2__L5-g-P\@`! 
ƌ,)ѬZS6Q%%r޴ B=7zZtPMB o!dOx#^-ɼĮ[|fX]pגpVd-=O% ƁU%e@:`6,ތ߅[?]I;PMNh>4z˓LaqT;NVT 1j}7nך۵?HmBryXhO3]CFkDIku/1O} ~in~qp?_W86nl+i:YT(ޗޫ^ 6l%{|$ϗRacsEz{v]'m-'CcoWi[tIlDPaQk+_ݰ{bcW܃{u.PLCv^5Z)j}v]xxŀvw|2$ BhQ%.[&z,S6`VA;fNEht΍7?M7hlo+lqӯKecQ_Ë1xii[n5s맻'ea2M@C0T̮hÐwEϭVC'yd%Jh+AT(VI0Ǣk5ořhM-E#!M2 QȜ|bdpe=zg-N"*KȄ$kœ'oHM}ʒgK $ bTI(,<}4XM'Bl Κlڦ \kM\qX^TpCL63ZL⑽ f UZ8g )h,:1͜ga4ZT0gVl uWO.eG,2:[(c!"[)E!ȯ/!BI{N{eهZ}ڹ\+M#r!lBdAxF % %lbͿF̿"g/w 0L٘BCs<p#>0wѴ^R[vCHϗF ZUTeҥBfdeAJt}#MR([Pj(̸[L/&b[rx 4&0 7Lx#)cr9f)bBhBf:d# ѥȲR," \g@A`=GKN+6L榧umZxlmQY;;2=O=S{*,0,GGobH"#:ޣ3XViKƧ'ҼT"Y඲"j*T䡖5oaG8R'Xz̹&ST6& 9c:s¾F'N-3ĖUku|K7_<6~VOAJRQ̐=Pf-c1ϣ뽂u_oW_/2]_> U]l{@F*;_>]](U0BF|lIqfu2TNXy }gLykϭ؞6xnyAʲ:' tLK 1Rr^ԉh.GGlRNbج~Ӆ NtePIa=gKeS騐6 10ul0}b{̬,jn=e>,/\$yo},\\rb`1G.ƣjTFc敎FUIj`\S55?C)۪4dw*a_ns׿X]7>26P|ܹ,o\:yqr76!&DZ˘Qx'OZy@ST  9I:.-;S85qffޚ|Ϡ]0;[US W謭|Xhgx*P Фr>[4xԝF98!xU齭2:0 !=M, =PCd/>^Lwz-,dڶ.mZYݪap::uDNӡ]+0Z!Ƞ+V Z۾kߵrV8Z!B(*p ]J J-{z6tkv=f`t\*8 ]mmttms0ULg]sr*hm骠D1ҕ.+D]p[ޓJ4=r5מ YqY ϧXSgJ19OFjvi9Ιߣ*^:ylR2ryf:|3@__vZS7~26ܨl(Q|I1[֟_~GMxup{vJeVN SOհ5T~̫(tTWaޠ]}.r usqjwu*C]b0~ς/` 3<꟯}ur\A`XyB @xNFD/ gѳd٥g3~{!]v2Fx5 $UV8PS>P#G t'T*G5\AiwБ#WXUeW誠;]=]%]H]w \ͺBWmreLOWCW䄚NYW3X8-eҕVh]`YP ] vRK ؝3K< -i;]-=]!]Y+,DwWv&^*v*({( %ݡMWo3XPX=]uԚ]^2 ?CgS 90]mvߩ*PZ2Jm@WmkcoΘQHƀ=fv5M`#MghtA{)P1ҴACf?[)ɰ~ t˦jf,Sb.BQ;YR(]u_d0 ].Sӽ7 qU짭q  \ۙqЮk_Pr֏G8Kc+l$t \źBW֙ombzz˕Ov\]Va骠4#+e1!*%v  ZzgF[WHW*4Ctfu \BWl҂#+cT] ;S.aNWWB#]Ydti& \љP{A+[">vut(uUXW;R ʕx`OW]+f  mW8vC-d e JtmsmWwŷn[3nτx),Hn۷dxl:S%gYgJa]re`G%G6Y] ڨھ;XXfQ@妶/7:鶂L.8 gʈG˓VE 9tYw@Vq!U+gAuςr%t)wd3tUr*h%k;]z:BX ^mXag5+tUZv*({:B"eK`lCW.bWoddAvxHb<;zG).dZĒNրOW';7aQuӖ:NJ{T?UV'[3X_XK~d<ɩxIA΍R@b!%W Ǜ9k|x톞~ЅqB$rvNI\)2ì̷љEA{-CR`32PDƜIâ!OTI51qw~B'$1/'WE?HUN3Jxu8jXQCXpzG^a^_2>A Eyc52ǐ = x{۷o=ٕGcKh *VV&UAL+A?_'#zNyDo\-r]{B%v2Cqz8ﴟ^eI+EDŽy5>;=YZͣu:\b6:PܠBf&)\28g\EX< *Y>õTPńMSq>KVR=}gx%%BԮLem2!C!HD@ fS1c[¡' q;"1 [HMo_;yS}hD́M*9Cq]^Q=r돮Ms_? 
4c cd7tsA'lH1#H.` 5F}L0J[)'4w/ipgYmٺȌ"XFv3LD 1/ oeHzXv)4y;$ȉ[0^hD&cAτȿ>Irdz5֫@GN?.\10FDJђ'6JD&g9#هp UI&;4(tcxƭp9!bLL ,AtIU_R/YZҗIUnco|q]_]ZӥdI ^H Fn8[4p=x6/So]mo9+>6_ fp`o"cMɑd;",j[Xv:AYfC>,2c՘XH/Ghy+/J|%N3B41}D/=Z,S^go( ͆ի] ]F8HbOVEIdΫ8.pO({m ٝZ-Y[6+NV[O|B`vZmIy[#8پFЋ -v{ .~Wҕ`z1"<Kl$}D1!+=w2.HzM[oOJސ0>BXeb7.O㨚a /1ClQU9<&zwrA 6l&Է7ߞ}_: ,J44mIӚdi!Op>;X r=ep>֨"mpe]%o\68QnC7ڐҞ>d+ vhLMS Lٰ/*Hge'R(TB8K#D锨P%F=z$lq/qnx3ysVa2yGZ:㇂j[Ey[={"ǑP ? qntAK(9<6(Nkg$F{0RRZ)n)>Rǭ[8G͐M#eoy6Q+B'0IL&,PcDiB J3.Y >pJF"( Ys$ɖ;j>fe+sCFrn1aCޘˡˎ)ڲoVHye*U-?48ڴER0)aWsS Xm0Cdp}bܱv0nG[~<)d#\)HTTo !S*2 0"&՜iI2F '76 ?eTͰ ^x9hzH+ o&DFpBꥊy3xjGa*6y(KȤ73QB"6YnQʛDiHK2hɨR̜:H#a9UL%BL!7F'FQ@Nr#,yPY\{9a/Z֒<,6NSxH$He K$E_୑׹`1K5C[ׅ y$~`^gܝ j> 1ǐW+AD-"kMsΜJC3J~psA!C4QIAELH jzD2m6By,fl`+Un|%*ϩʃ@5TH0:"lsx/a$3 4M #)q-;yPJpB&{r2,: *\"0.fΖSB}6j:\p7E)dz٢SgLjxE>oE<5}Z l2&!L9˂g2B`#N&}DKDm/Ï;eq"9r^ZX'4V輼yG}VyIBD TU;&V$LsrcwjOeyu8\1sX`ss::ҋ)"pE~|e9v`/"ƼX 7EO K> <z"X@pQn-"t9y'IH= lyD ]vF:> O942M&X Zؚ^2&" F9yAJ<ouZtp`>J3@,FK%xv횦9셇ۂGbds)Fˀ\МJyB8K}xEȜ6K͒7r#7"5;C4X6k閚|u=l=Up%Ux0;Cu;K~3q4^ f9{E CJXe?dK(V$?訋'?oze:tU.4/1yP)g䋕|YAZ9uҜAtV`FFg''Rp5X!M ^WEaIoM֐O]> {FOc55¨l|vݰoUW-Nm,r#y*FO)ӖW} }zwzXuwy\_ǣyVaӊyS[2ɬĀxdԻojzt|nUVY:7OgGvig߷wy}dtcbv;Qi6wi~ΎJevI6h4Āع[2],,ӫ)Ih6/Sc1ݳMda3EtgZ *uâҭXl4ep6\lR ^}Ԗy/7WbXs9$OLKd:tWwhA6xp);\B%F]G@ښ[tig^JJMrdXȫiTMz-! W.&HF)oS~*0bA$0#QMF;Зѐ:<(e1zE^n!}qv~v(i\Ӹi\~ .&0M]gMey%/BǮFĆxq\'o󯚳4w5I{m 8Z7 sft|OOZcAV9 cbHUZ@:_`\TmW>A᲼D\;HuBYh\핐Ȩ9e} `di@k7<&!s-R]'Ý",D$R(x^gEJ1%Tز#Ρ"<Kl$40|ԂE p"$JO5D]1sX# oTѼ|@t|y \~J:ܭfdwMO枆 :Z{)/&_?(&)r(:^<(a?E+G=8zZp T`pl4GqeTq'iy$HB`8J0խeQ' aWPTP񸐶,ֻ[oz]4S~nΕ) KA͝ ^(1+Z;۰1<2ۢBhڧiHC"p?~bX|'7}_) krԜs_ gUm:Y$ {rr҃wؐgSE9F=\>iv$Y9oOcu=x!Sj8fX.w9q F/sTUgc%7QϽ)Rzwr!ɶc|s{lʈNSLy Fv=O97O[rPa>?ңeZ=`k]Z;i=RrvksgC71"ԟ菋+EIQXKG];[k>y w`KNv,і^hos̫N:@%bwDq e&ʳ&\V{0cvvӘö,ZﬞELsDocP'# ,AeDdI@ZipLERzQ|ȱnQ p(QQ&_]r[zDUkex<C<Xts5hY Y]?XEqFh3=1UYTTUBXwxzy)EË 8NZTd^d x-αY^r,p%3jY/ ū$ 9IaRH$eou%Y۶4n3wdǪb*y t((#! 
91r!̷-^ k]"-( aAX2\˥:sC5"F7cPĸgq2,k ug¨3%R);̂4@?ٓᛕVij(5kI[niAjE !ш^r\AҊI5\R9K3fi4>l8"ׁT0!{R))@|68XdzgBmP|O:e NJSx讠Xk5r:]p)>M{ST.B\+ .R߿ > L:}zҭ9 EOn"b Y`^%kڱ z(l9ϥZ{xsS[ZIkΘW#b^PVF+my [jebb%D .EEPi3T 21 rTFZjU-=GǞP48t.d˼f2C.ieK$byxVހ՚ϩYjdgiz4_cqӳjU<$C)Prࡹ1>=6r‰Dr1 h.*i%,)XU8bNXhʃRJߋR;,‰ HoA&}fqΐ&C~ ŘK'uL \dn7EOEfH::7#YNtGJ$<&V#voj^]/+g])Zl3`ev* `AIҏ,[ښYpRsw7UVR(EtB&a2 Q" 6TPL! AQdpo *#2 -fO*`)lK,`bVXN`Y%n|^f'?cR\k0s1Tr@Fr]]trc3E*x& jM+c{$c%H==([:Vh[3/jgP|"w Kkio!WC[5f+*m+> ;'+v/GNP^y;R{PIB|wФ6JfpOqXcWSVNp)2 eT9%)IxW lADbz.A).<UYPʉZ\:ˬTQ/շb!Amsq6Fuڧ*X;xkՐ8zXi.} ;7|n|W/D8+C%zn?aFrR(rD|Nf۪9z~(طL6'N<Rf>S1AL*wChwXH$ʆAƃhrhբݨh=VYXfـ)H!ER"L@sSA t`c4pX&J(-BAvx$&s՚[e(K+,o;f6ۋ'7WIZhy_vi)ai^q@O/KzeYjfpT{!ի˳OxH -1芴s]E\!evY]}D罨Xݏ5񖔙Ig2,$sm PO ̥~PwSFPOvm򝐢gܾ ~4v$̞Ț9H! !E[8T8ѝRQpez}L]Kx^fCZEFh鳣qyer?I/hVw.F>|5u+bc_R^~xgX g|}#Wn<\˒k kW/!ۚȍ $fuY,[Kb.zZWԺi][o~= goxKgmn G~!e5_ixoyOv]xB07ozw<vsX?zˋ'g'oε^oӦ"!3rt=3i fB0 QJA7`Pz@ z*WdGJypqeZ9vc&HvH^p0>(i1'TAtKE`Al l,дwؼF7 TcϞt8},z20&]fFqEv\0EYlGI\ R e IN],%r͖BɌ5I@vU d˿zyDHHJVpȹ]PlPGbZ t1>1j8R ȸ{hAShȬ$cGXṼ$y.>o2#ɇsֱDzE 59JE \5FL.YzIs s%}&j#g,d*aac+ꆱPXX'8*$on4<]Ճ?nPNjp#6t6Qi I+O&k)%'1Y"&Σ4.&r5b1| ٹ&h Z&#\* Yt(α2Ms;b~/-ڍqǮmFmӢvnLISuȒ"0y&$VAYE?b)dpV,َd!f%Y$dQ 2&K&C*HF5pc܎Q_2 0M!cWD #mEč^,Ş'e2BH^&J#qO©"H Ϋ QDt3$&JF\p&D@c. 2'Qtȹ?#Mu \N앯). 1c kqōy{ ه EA*m9j*T.D\ImEa ]7<8|_MApmD?:kha`r r:\ÅڇN|\ÅJ m/&װr D{7}w񧁫P l>\vzҫ5WGPB9"j9{*Trs+턘3x>,{K&\^q[;0!;L+8fO_^G㦉F}?+:d*{, cn?{ӆ}5^e5wB}~ cO@=OSw{ q݇iwރQ y c]?ZNV8 &Эyϋ+q@}|.r9 J$wg=W+?Z݋k«8zx&*'* WKY8s>dO;%!@OIhYN)`7Zv\C-RbWI c)H] 2Z #sRJmRQcN*-@U=Y(I&b([Ax9>HIuQjA9tJуd:*-[lD8ܗáĎIDĠ wB|wApPC掇"74gJƍ̬8l\:.0ej{8_!0Qns["9]Um1H8fz(Q68͙穪J)MLa5RZC 4ۋ!3*zw}yӬrPE0s(+d.h)rr&v[hdN!e-I!HB&Y"y715FGR)uY)=4HԉK־nhSi7MD#x]:h= .'tLH1d0CGK/l͇s5+s onruL3Tcvj|G u*RE.V'! 
IJIovh<eDbgr6T,!J22u6m⨖@C3+#ژF_?5wۧ+KLYTQ `z1..gTB4d)TUX|RNw޾B34錱8dK1x*bX |{7oQR`?#fJ?g7 ]6-6aT]$쌏E4Zܘ]T$sj Ef}g63}3AA~Mz Ϳf}ovYS:EG0 d/Ř/KZHE [w5k>^ҠwW6]m~ȭ>{w`MD6W yEs >\wo?# ﯮF>-`ka]Zz&el6l]ذs;l]8u[茓 q"kLw47Wi@7c~XJ4 zMs>K8Cl9Qߑm9dsǞs>.ρ2E v[hcJ ]tN{ ق]V@3c<bфrW],h}~R]y)bEi*Ht uT@H Rc GP)>I|p ) y1,7gq!~kf㿵>eҺv-'W3ڬ׋y#K/f>11U '\Tw\VqhK.>ncV;n>~q,G7s, 9r /r2sKTYYtѦѩdBk48R.x%y|tx2 ;w,hO(ᷙWi]>)EID&0499U^jg_i ꫫ}r$z,6Kip 5 CFaP#FrAs66&\mD$gtLN!%leFtVl:;r5ꪞՙ xx7u5f쏶^b,~Y5ey/:STQQXLKQPI2@R.ɂFI㥘/ <^`ٞ |,!2K,`|`] i)! 1֧Xi[5=u\}l&Y-="euϭ?$VV2XtY,5̊}%1-V`4NYĿ־>IS-M(UTª1i5:+*ԊfY\IR`H"2[Eڧ%-L(^/ %92a@@D|B 9T2ڝ ]nw6~+^j޿^09܃%z.ꇋ.~WRNjW=dUgOϢ;4K^wݿ?uG%6]ܯm#^͊k2on/ V]D}iDxz4\7'޸mhkI#ڷWIK۔2?|^M~oj'g,:??YHot|m_侃FOKWob_?]rW@*o&NW:}z7x|G~{x3ҌoY" iMt.{֨NF0Oc*7Jiϋc˖~'o:\0_Q+44d{ Nnci v-9W|ohB m\݆GzKƢ_l}+s۞I3v-łþzdWu̢~^2uWebw5,cumݛ9lZiKvM<2[dW?î(6Ur>7 ^$\c^kױBU7|ldig15j`[(jS" '#0JT@efOlPP)խ*̉=cF2V{~?FǑ7H;ICv\HN ]A8jM7e!^Y$tTM=:cҋ=a)yy[x{2>6v҉F(F;l]u.)ԔRxZ)5o"( $< *)>FH>mb (.btT)D-AcL;} Jd$ψ"; ]QCJQ 3`Ul"Tct^ӑ&+o)PHF+ED :f5#?M$g$B? 
:Jy"Hݥ= 'M9DRA勦QJ2$2πd2g2CF%(ۈXL#V ԇ`z^e( O*BD&A_R*QbQvˏ[5[1*b3*SC$& }()dIB{6!8IǎHF =}Qvt@[s/jgLۿ`M_\F%5*ݘ$rSVM%^)`}*%SuOݹ>Y:'-#EhuCY6h߬1J.m 2Jԩ+@[+3:&F)\ۉ^\⌱,k-(*g`VyK:v8 S=1[x͘%fK?i>c~ޗ/~/UsNy㥰Qe8袇U*&ZH5uOIN9L,ךZ}l!>{TTIPZeҠwwDFVe`PU zDlQAd$l)(!@H%Nfr#Φ~=Rd`4P..F$f/#2ZŦEҮ`;fݭB^à|z{uBϋNU1n܍Gg4dtIWVmOnގE Cv sTcUU3%Lzai_4fn{eZUob5,S19؇A۩M_ue^ T[LWpn'=\Ji{0 j -ͦџ͢_)Bo#Ń:ՍX1ay}Mدgڒ4rwKtnXfFZٱ෺w3ѧvo7߰i;ykvt|f&{]_+k*QM֎^PtzlInxȪ TiS׳ n>RzͣLFo>=?xIcO=!lxd3O>:'6P #7h8pa{E ݺ7듯n9< hn6tޚ ,yJ|}kN]:BZsE?L6%!z.$d;],\u|Ӂ'.[D`l9 'O܁G]r 6S-pPlESͨa:ǷlvC2lKLԳ{d LhE2.Q.N)!X򫅗e$Қl+@BI@כ%(1I< RHdP<$WV2+UބtkJqe]؋կsWs \+N򙘥Rp$tպoF9: ɣv PJ Hll)D(u.u8q>ekBɂQ2MYsYiʆ=s,ɟR;>?cs0x>&jBb_o/[~IŲmEx+ &Ѱ&1O*8N^8dJDc (ȖHXYCJ $ɬ$^iC0<[ˆË+jgٮvfb3,Ìcb֙Yv\loV )%d|~jˋڦЩUޡTQ IbENh"y>@P@3hcyUK \0/z,1+fJ1:tҚTHum;e}:*$c_[:d j o*2>[=SƟ0~M~or-D&JX&LLEKrJS댡"ワ-6C T^B%tf΃3*bmEEH&#](j:-vpbL:ھcOV`7VFp ZX)st׳B,L$U30MdZq_Dʈ#"x*4eKRDByr\L$h" e%cO0Y PB87RpjB ZqAD<)< Gs;"~zbu&f:kiɽpR͈#.2} @J^QoROJN)hȄIx@ \<CZwa?mgu&'"Uz]nw?TvLmlr,l[~~K̓:xW~6EN#q60عghd/y|{_0iAV׳e{BqvQ$S9GyyKG.vV%q~/}G /-i/E[ ]g}ˎekuX{kHUae]GRyU!1MWQM Vt8,*FtGyɸɑuAmBάD=H89!] 7JIPJx2lEuR+Z1ص;J+ V +V%뭍j(I$6 (%xY3r NꂲR N9E1͢0j'ViNq"L}jlᗉOY$t,8$|ud#GQ~[!x2B#iJq^E)ǐwAp#$)'%!-j'~,xC-@D VAl7{HF$%YpK]U;$ٜ)Cc YMX0Z0m 5("6[ۃ8[YVIoPT7902%㉵^$졢d j! ?eĀM -p?hÔ&9Kv"ň|!a_F" FJ8aMD5RjT/'(/|tTȬ7H%rA=kG?]m&MEͧqZlOc#gC]N?Lg+~]*+zv `Y=m,|OxcUr-X4|aM A%=0ZXVCHR)BR!t,fS}h|?m҂tB$/iu\SEF*R+<1 Œ<s:y Tˮ~SY6*S;KJQȩFL!MDH<0*ɍSyy֠=ƕtݥ͜>wbfdy[#{$ZE?>Y,_Kx۪c2 ZjFIP8m-12P7e{Q4Gg~f`-5&S6M!#\DU::) LjfI@!yh.Pw!c\k\V<eUF2vhZܾ;>r&jl{emܾ;8V=}%*O̘F\yeӑM3G +pF3YkCdĒ`i4'Mc&:ÿx.=Z!bJE$ FC^;nj~5wC\ZS#e.46?eTdڋ^H 8[fn[>wosYNdtPPRNR"LR:z$D27ZB SĠXÁ$JIKNmXMHveXn,<Tf fϑ.b < <ൕ޻ȃ8jQQXva,8rr>LF~F'#s>ZDcgP+8DdM6Q#CD:iv.I!<_7x[YPl>Yj o$A+sw̸08 s@6"Z E{yN#N#Jъ|k-8E=̪=!2{,hYr$Nh)5PM)PD< qn%aL$KxΧznl $It$AUI !=Ca5ӝĜi OvbZNKOQ[6u2nH1brʅN1gvc-VWHs&I gZ"j r(ƣldsNbvGTvErR'מjtD 2jH#4x.%VѺ[L J ^0-d1q . A?QfE ( <k1u5q;2(=o͕}LN@˾t*w釓Cd<}xvO~mѧM˦I0BӁȬa^;3\PosR!ZvY/! 
?0xm DǔuҐ`X xuhrx&P+1@#`5g3%-CsT;**᪉Y܈#>3ﶹK-Fώsur3v`>,mQ f~1r #μT^2͌-\t.ƜX#o*#|>3=#<71A$[ LQ9B䂁ɩYöf%;p%6BOfˣ7|% r6?zϾzxùo0T4RSUv)aPn|ȒPn5^Fws1n ueBLlr}ٳi⨔#uxEv9hCh~>DiINOp6ύzKgW4OܯRo6;>g؜,cb9;=vRƋo+ub_,;J)[rq/O.qFpL_L[i6~4{*7ٴ9)ZPo>$l`%<gYA6uR@Ȫr?.f-gP,Ep?,obٴxiчi~ΆI50qQ's'0 cM>60B?9kg+P]&QHh/)'Tf$gvS-#q֑?y~"6"Aswa~pE1wuFj2 ]ЏCP8-pW1/4gC+u6ت:mcv]szE\@FO|Qy kTm#0{ jdzxJζ&[\aeJ TCmeZ]_Pm{}$.OWK.piPAٕipرL ЯnA~qc:/ݗt~rޓJ;OOrUKݮX0MPKeX&lg}JTWE' /Z/,UYwꮐof!j,J]t[n5{K'LJ7<]v)U&"10! I)mh.T1Dcn\<3gOx~Ξ<*xm E©THThGmK. m UzU@/kO".N%rX5ݸg0Y9rCBfT{UO8ŕRMЃ1#LܫջܘeFYĔN_tY rWy=,aSď f~xо8LJkwl7AJw+*ädRvQ$Sde >{R-e;+C%q?^^"uq^QB2Q%" h ,/z9pny,/V;x"'PV:jB4AɁzVw.E">HS;qœ89:kr yi8O .õ<9G? QUH%st;t#ӃqI5~@E~]s`y<ײ6-Fםfx"CoT-4ȏHka '=6 gP:ϩSQ4ˊv tMPx`V#EဌyPnn*jnTtp P9'[ .ՔM\c!nDƆyٔJF8"KD%ɥX,L-?xKuw1*خ&^|59j%!8YuGIRD2.\qKmyfr4[` `[y~p=[`_H:ۯ׏L0w)Y0"rS;"'ZD7tOxaӥvأ ,ÆX~Cpܰ9͋v2vٲ/}^WR294DQ9"g)yK%n- c1 \To$ց}2 Z]H1r6(yO!R:9?3:oG1 vCKG`k!Ӓ`7B ZpH  ӌCvDdVs3C[ m !k&$aά-NFy7>wScS<]rlƤLg4I^\r\Ĕ^SZ&nFx+4hJ&|K%zg.X9Kmk<}étoͦ1{~7^m]ZksgV|OX]\*v8+>,eS;~~'R`(0'j;CЏ-ŽX eQ'U/0+qEŗ+n]^0s WE4|R ³w}~<7{׽^͎vW*~Y77vrQN?˟xրvj=\l[LHK}1r4^D)%}?7Rkb\**d ;g8`p 7=p}v{/[Ҽqtt0u2Y XKLxfdJ:C)PΜ&@>Lnʧ[םSXO'yZ\>4vGF~tF0^r6`0(e:Qc2Se XUh8ܧ{7*T  *$ۨ?V`O6V-LdRsp{10Y7§,U/&!$H(x}o$NjlkRU5?eEgdw+a_fbٛuݨb[ڶBq+^Z}=bp5k6hV.< $$C43:9 828MrMr|cjFy-2@ۚ0d( >@hRfr>ZN=`r.ori2Y h?}5(d,6VCJ4"A mc}sϤ3&bTH+ǭz%XFy z-UX.mv+KT(ElRW8sx1& ɻ7 98zv)xK |qճ6 ᚭW,zNϷw=I[3-wLd=XI1J |xj"AXy, 2ͧ= a-^YL61*hх,aG#0؁`8_$BBC.q9uO4m"HV > o91&D R`N!f0kem RMKH{"@\2r0@ bt`'hN IHd\ qUqgfU{d΋XSo3/~}{?^˻/~\ǿD%c﷩"S 3%HyJ~H{a˽r4w?݁5,yv?-.^ cRxSx'msSbkoT\^oߪG}?'PTҖ?ɕ\gg7L](Όw|ߨuDNr\7oܿQ|r>Cx;Ӈ΋ht?=0w`QA%Xn}7bF?`)޿{p܀!' 
\z/Q Y￟&dg.sudػϽxtyHGޞv4Ye<ӿX4o9:MEu{p =nPjg%,Q?'S,rCWLP.\|i4w=fz L eu:B`嬴׫GYjrI x@YG<~͌Ii`|҉!& E\s4,uf2;|{){-(R܊qhh&)c 0f4!P ÊL"`n7 B9\ 1P0kob^ ݎz w H D<<qL3XZ"4HkBZw!H+)t!Ro(T% y js4.P0!z`FgK}WXP>)G* .aTdm24D ʺ=%[ա&z24%FR1ҵ޸w4Q:aXpL%\k)jedl}΅"琵iC5.wgr2Ť6G폐ʖw{ Q{#՚!nC:r-lc NwOUN=QWn^5Z ΅|( oYd60Cu s+a vh!8;[ѲB6dC>0D<J.(+8 9 LGdPhc"Q'o3Hh!bY#Tʆ`c4k!,|rB1/Y(K#0!.,6I9 #KZm 0.>^uz?z^+Ni?*Fe֢t4*g-^Ů9OT>2Qz7/3 w:$I3 >3'S߆Л:wb8jh)>qQ(SEPNv|>ǜ3j>+{;N"b@4`_!\pBFgBEIߪϑtPfvUzPM {X]xJx n6@b2z䊇 u12 `c[_o? 7p7ǀe_˒5s*kî焿V6&r%i>nW65ν^k^-zZ#h]?F֫tfMœ[5%ͻ]y:'ӗR^*4 ^TA˪5P UV8$$eʉcek#sz{Ŋʡ$JrQq*G΅FՊIFɳހwm4jk).\~YgE{z:ҹKq; Fc`vA;,Ʌr&<J(>Z'J1OPSIq26#Fۻ&E{f#MS@-Ҥb|aP{#m1;_3I2Rs$#]{J_4?qbjjи klc\@(@XES.+j 3jK `&WlH@ h 6a<~n\#^UVPTޡC6͖dL &hIU[? s;fb#nj H=j<"@Ҽu~MBqvPkJ'k 5Ca\eӌbaqgag:3~wHw\}Y!bϋGl[G*TզeRWV+cEk 6W {r8Az}K>b#:ޛ+pl7B?J;/嚋/=L;8qF[(_JL&X9^ 4'\6ئ. |W͕jod36dRQ=deԋ.bTSSD%k2)K b:au!]$t}9j*k yxz{(> D1pFT[sQW,PrɢNkTn7 XO=rYޞv!pljL%7JbPc`/8Lv}r<Βi7v|q ;*L,AQ\+R} gG;[o.ߣzyozRJ'G X\ʆ&dQ~H1H)&0I; X,l,Rʶަ bTJ!^$Jf5j'~wlMidgzTǦmVa!{RhHT+2ѹUF2 zd(vNf;~&&[#2Zo- -JqJ䀤6nGDetZ^{orS/ 61]2Bۄ)gNDM_Ap%I$BظXDl j<ƙ۫laT㏸ ڻ -KZl!zU__5q( sa l(.u֍P`:;=Y),\ux#~G K[z{u9v>?\u/t.eut9eg_dWbYC؜'bdUɋ(hH(p)omxǻxU]6E񧅬=׃R>u}VD?>5go^Że-=P~w:izwmv˗wp9rf|^ "bW uv]c_[F׬itT/D{u>^K: ߞ(dV t(NZ^'gCfX'uF8VzjWz <+jH # >Tl慀3&3FUljt36sus-bӍ6:aKQ=gS5*N[{;8ÎT) 1By;%*Ҝ 6dJ㞷f {{њ(%)UzIQɀ-&)P3h]4MF:hɱXUSe[m#Uf$ b(&H3Uok .S+'}<-}ne]|? ^ĭm{|.|s{ǻ7 NbfB|:? maR]TxzIn]&/BPq{)sRIF\x`|wr8o:6Vs5(sNB$R0{VNvo @. 
TuΈ>ȉyjT8i:ܣ匌5L[!]>ք7S1U;C Yk!zufcĦrIٗYD0DwHXu{2&LIeT_ %7+l85V_e%ⅣRifHPzZŋ#H$6٦lb8s:w<;=^,f~Fߙ#8GŪZ-$_)%5f9rH snؙ wR| v@sQ)ȾID[>C߉o{fޘ{[ ߘ/gޗ.l9[Wț"Jؠ(G[w)6?!?o3wgaΐ~7Tc0Œ_YpM\آq7ڸVp} ƹh?~ʑR񜪘BƆ ll"@(N͖IݝrF0͍47JI"t4tu~=쟟_ߡҼ7L1_bStmC;MiZSTr AH1ଽXf Xߐri?>os ƓbV AVRr&j_Ai((m"ܚ#J%FT!yKY{Au]W$$&C)5Ã0qv<8ͩL'ǿ}Y/Jo Z>GyK׸̀0N>SnS}ڞW~Ҟۇ߿_{:,6pru]|G dr;!Y,TZQ0#³ Bk0Ï I7AŔI)P )'+s/IEh>c3bf ǖѻ] ɩeن Me>0qvȒg3[ <թ2~LJ}-i;ttk.^67K-Ϸ|t~Ɓp>hE] !s!e8jBa1s|z^f =35aj0%)+*7(W&){k)jQxF!z*ʡp{axV:,@9C*,9WmsMBmQ^lì~/CC_Mjfw#C'C (R4ڳ; twU{pasp dBoile'Ƥ|h *FO1=ݬcLp Zk}YtxhَO?ջ"ς?Yo,Y"[wpZۢxk=G]]:s_6YɌF>^' RFJP۱ uqFyZl='w`zcU/܇yWsszaz0wg׃꟬l1>U`(p6ÝfKmSz۔)ǶqYY ~^m.omo(0o&ꡉ?a[x_zxS* N~LPo/@9qk`V;{~zBrm%x[NJѶE==|_W_VglRbiytx. my0+xYy=<6nϮÖ9on_{3E*ˣް2{ڌzXu`[Ocd7שՅj=Ew=7t77\OhrB՘wmkyr+g;:|ws1h>:{,X=7V K0`Z=N#\be]~ye&jϢE :2p~ן\_b)XmdE;zcېrijj+[) zC@k`iӪ}:ٗ3t㉥g3ߒs==V[Һ$WIsKeIgJ{ʭ_)k\K^.F0Ac&45U- Ivh^Js{ڈg~s̟7}-i?k^׽7+U7|u_/6%ECW+Q߭N?< po+L÷?.N͍ߩ񇇡vWA+\}JszyFfr7(z=)#8o]{p) >x- ZЅ_*X0ոfBs3t{n'~M'_vˋ=~vuԚlwծݯBcm!Z/ [xن|bRqq{}tպ^.KOW/.ru{DWOW ]uF?u(Yf*rKtxt`gׇǥ;U$CWwCy6U]AWz堮oO|H zkwm|7- ɡC2?~-^҂ބۯRaՂ`yݡ:,9zڍhxPZ,rq#Ǔ~r[]JbuRmw-=Je,Oەä-DW_s폎)맛+Ae.Pl7:O3@=Dg!JSK[,YUss(zvH7(oI nh({n( uފ#>?t¼/tNW@ zte!X:6{tJAWϐT퓺ꀭ ]u ]uɛA 4#]YveIWhytQrt !t=wxp. HR{GUG]}3t%tzIJwz\#\VJWwD+~GJ] ݞ6z p ]'OW@y堫o+eH/ns oxoG(S_'K>=9r<{\#]mZƝ@Κ`-(X^6[n.1_}G&]⛽x7/V'_*U4y7o7:]r5~# i-ľwm;{aVˣRhu*l \}KQ Q.&QİU^^KrAXjPl BmZRѹݏEqsTlCSD4R}oϢivNX+"T{ɫPk(T2#(4QkʴbK1$YD&w zoa*Z%ic]Rݤ:RJBDh{,tbش $ۀ1)Jcř3UZN1u JZ뛢Z&J4؄T*ʡ!(dXD.Q Ci >劁ƬKk޵,l%_ .XD!= N p$wm=s䝏Y,-m* Q!E>%V2X IϪs͉(UUhS3S$1TN^sPл1BDk}GZGH}5f)N ]9MDvЗ };p`Hj Ni( O2lr6UFFAoU͸" 3,l,` 0k*RM#hdGBǪ`+٥$ՐeOh\ f se[Da[PQBѩ m>p橇0mgMV^42n0P9"PyUɖޕ2A[ <C8jdi ,&Z]qk XǺg*){M A@p)ON0G␔YlXJKmx_uFdҐK 2 8s)O-`C"\5"U1+VQLu`6Ͱ  L'T, |佂TTKsv`2Wv9Pgžu. >o2okL&4e_ f gTtg"(J892r$XY{Ɂ 2!}֩RAPSNhZ.UcmL9MųN k!H)d"FDg'͵"`z `a偾Sa^|$/U4Cݚ;7C6d m*^X:7P*y*Pe 7YrfABHMvur;<+Eq>e[}%0+tmi,"z4ԥr1n2bVoDE. /r!-& 7A~,Ck5kq!ɑkǂ@AD ϒ]IPS`6dV( @0bdMC* <^a:EȒ,Bq`m4BgV)e ! 
"7WV_ 2E}LU,15B [ۉ,Pk.otXWHgYtg&kdUՇVM`m@\ 7%[ K.™6 20߬vB2AH@>of\!)HOm-Ѧi`%nnhYۜ-\ߦ=;ZKεݙD1@ŀ š4&Z63L!lt {G(H9 ,FnY{[0Q!e%›A)qDϟOзUfu(5x7LJxKdl0RbH~(CG؍ZR L3NJd`Lv  )cY琐'kRn҃Vz KgcP >[6 XW4U\!>jHj'c~{Aʟu׻(/VSCuE#TYࣺǡ9!YYp` <ڔ.m(\CÁ7 8137O+~RfzP**-AgtLS5dĐ$jd-ڤtZ]4Kx"j+ A:ZAiA¬X k̀|"2~_O4%0ԤLJKF55S`=GCXPRₑd 74 AaĔ *5^/!b!}N1͚~N9"81q`IWrl$dip˰Rˏ]AqΘ  R G^AOYܴZ_;(]FӒsMKxȲ Ɣ W1ʠ3l]zE~ 2He3uy~Be!~_O?w ګJ VM%Pa_*:ZGOfT=J &hTJQ 4*F%Ш@hTJQ 4*F%Ш@hTJQ 4*F%Ш@hTJQ 4*F%Ш@hTJQ 4*F%Ш@hTJQ 4*F%з[ -O@1?@+~o*kׅP j ,+MF%Ш@hTJQ 4*F%Ш@hTJQ 4*F%Ш@hTJQ 4*F%Ш@hTJQ 4*F%Ш@hTJQ 4*F%Ш@hTJ[ d}&S%2M%P n ,*>zHJuU tc.00bד6bp5z^S󦅠݀i^9$߿8]o.)HuY*ԺvC^񺡴Y<)iz_\y ^MZ"b`~]K_\ɕ]*-G Q>nZ3kK7]^?niwjblu\wmB^ZnKbnzo0x@ܿ; »'\bq{o[m#7qMIگBqrJ˦^!s'kП4~ [1s =rUϨ-Ea o0ݩ‡4ʞTS#ڽ~FKy82Җ&5{?{ȍeI$/Lȇˤ>ce- dK%ciHs]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$]L$}&iPu$s;&i 4&iD)y1I0I'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq'Pq}NS5< eRń;>0ϻhv TY]2.ek!jq)ժ+%D+eKIq|E*O.3t|6պtQS Sʈ]ev.jUF+h*,>KTNbpw2Z)NW.X#]IF  Xw2\cBWQ tQ ]])& % U+:jhk;]e:CLK;DWyNtjBؙUF 9ҕJt`ȆU+eW*=vkx=]eFj芾Nj/XOLWzZ P*.{-tuhSL*V3tZ ֨Yҕzms_ n =l5Y4*ŁmntP%LW{D B/?mo2 4X9sDEsX)B+A$gXv6%usM(݌d l sF{+So<{'rgV"e*|ܣ~ewYt\ >I02Wo{kaX61d~rHs~oHthdP4z#I"Vse`PܐKՙ!ռ+qD=n( ڰ6'O"WUF[/s3B ]#]q.]!`3t2 ]e*U^}SM:CWkj~(9)tut%WvD¥JtV\%"ϑRo:At;+Dg+D Uΐt0UkdW O(%-tute0Tt+Hg fNW* ]i9=]X|>ћPmL]BWV=ULڴGh>C`O!S:C M#Z~BbsiC=di[h< _E 9_ӎ c䅙beXYo^&7rzraqY&jް+-saQb z5hεr /2*HؐBM3իjv]\j(5iVl;W%YSZ3+a܊WU`Tr|jYȵ{b*'Jn̩}&) k芢h]dEs4K  ]e3th5i}&d%stŵ2KtSe/UFKtut% ] s<{j^h>#tM D:C ذU32ZNWe*t&tRPB] ])]+DKex"JE3+C%]U+UW*mz ^XpASlN^p59q}?Jm= ]ZZQI;DW|zmBWHW @ =j`Zмn>֚-iu .s=pHYsj%BⓨM]m~ಡ[Q'7TB;URWB6KvV73ϻo;dԙ~v?/PVDSeXQm$\|ܟ+_p~qr5>*28uySw:~pG /k< qb..=Vy0< Z=_?OnvI#%WOs`\}?mu%?TeFp˹JMGt2* y O[&wr#V8GWxkNpXxw97 N?+˄GEz]ІH \Tῼ3&H}2 Y&ҞN\/yٵ&VNܟaS[lΛ5Jl7ߵ>v4Q~=Yo{7G;Z] *FYV2}4|Tl;?w_n:(jH,wτ!HrT҇UL$АjRPs `RXfA1x`rj=kX5y-, 7ۮWpG8A}8ߍj"p~k 6eMIoF`|pF{gyp!q(6}0:i0ݎk_6Yn:mySjA^^Q V/0e.Cz[<GQPi0LVv8Z?_ŵ8(C.<ާ8<`~!4\ ksqk=Yea44Z,cvۛݖ 
Nm"Qy0F*wcydQ8NñAf7M`km%:Q1bGZ+Rf;JqԘRr|]W-}\B;g8J|ԯ\['cWQĪÿ<`}"4 >qu'-<+d C X˂LxDE 3ұ=`hq C). v!^\&X΢DzE |f >f259U;=(gwaprIy6ө^=%ITL;.imL tD(cQc2γEcW$I#j}y==Oo_>߾ρM]w>6穡 eQ. zz~F^sZ Qp/)|}oh,^}n)DUFSEW+՗}R=RUzkp|W]@dU±dsן^M]7sx@Z!kE[zviTl|tͪ&q09RP%B21)*<`up 8xV&v&xsny :zQOy0m[@[$̉$L%Wi{s;^gW&9xpXyBq9^O ^@|G QkKel ֦v(݃^A&Jƴ'ӝ:2eJ; wѤHRE]^UYͩ-xk //rEƒ*uC/Vn=⮅׮.C4hMu .O.H*94aP+ 1Ew5Uw u]F;ULEm1 A7:$MHYm(Df}$,aal+e7?BHtz|h. Zm8k arVx8_M7!.3F〕0N.E]N;orh\݋Kh&`F`y2ƒqAΨC(r%aAz=n`hf<̈́RH#Z>j(Cd(jg.p_i sCSShd0JQ&S|8$O"P"3d]`( ѳ$ 3=MZ "ԥYX-PGb!d=*:4SyN8iPM"PeMtv<_ ʫb>֦>F'"Eҵ޸{4U:K@%\k)*}E}l}]"S3H|.pWnOtiҁ@8F2fM+ɖG⪁ Q{#+5TW9X_"DkϐP+〨uRs%'!dUhc"1My,J]lrր5IlX6煫>UO!_qt|yVf { `:mdw`@GMȕ7. "kǮWLV&#bh9 {K6<1?_ n nnۻNntv8ɘݻ6z^6gZϫa$[x4S7I vݣt<ۿRaWzbi?tWG!6R/'OWgy*V0# 奸r}./ oIʪo"˼Q2BcMc/1P8-qX<@ db`#g&Ir>3"FDd΂ͪ䓎}N HA%!rFnhZ15bn Cj6Ϙ'`^i_pTZH2pGkQyF`洎kL:9Fǹvsz= t.!(ύ*GkL"vg=>.{W9yc^kY`N=OBZc2iNBMZ 6(!=25yơ/tGL\r`ڑf1'dt+Tfi,Q $A Xơ];ڱ0x>ZEbLh/o_OEOef(:);*ZlPJ5֍O.W *f&."dKH /%"*Q(AZZ a93*8pi=.B崋{;sȸkUUU*YI2 K͓I9Zؼ-ll.M p:hWu%&Cq ؔ%%MтIAw9+o0i$jKj쐅V& }e, u/ T N/ 2PNk$NMnAFN''.9fJ# ERgȹ]Ju"}%,m/{Պ%iXٳA*^&*A#vOAFe9  PU":Fn$wYW%D.r&B6\rMX;[r5rn!Mu 8*K.#*E^.n B[\*Жٜ! V>p[u")\|x(Wpߝ_1 {?>Qȇ)=FQh^qlg{43ה*MZ5V!0:ƾϴr W?}_c.|u4H y͢PI >{i7+ULZx !;{|v`}Ww VqMSN.?c:>CO+)4'moR)E~70xlE,ٿGo5zo|KBʬU+׎^L?Xp3nS`<;z$9[ggfUTn 3ʽh⡷/>f\mpI39=!;'mPqPN_{ݟ5Q`3Wel4p>fF_:7M67f??&uzᴛ{Xb}놫J)&Vvf#%I:-m -}5a̽@ F467P5,(.W m.^"t&Qw;!gy]][O>k)R]$6JOn#bMkbkfsmrprM;',g4}i0=\w7~ _d-ҖO}X4O=re4E1Lu\k9[`R_/ WSK܀,iZ04wPjsQg&$Tb9yesÛKOtaU<8CZVa[I'l`iVѳWwMS͏RhɆp` !E;RU[Q}6|+1oyy&"}<gź6⧝R=RLv`Ddts85˝R. A$'0dT%Q\FEĂ#T֗~җ}vn֞yCzom dCAza[Ey;={$Pt2zM=R9ZT@ Jۘ`={Rkծ3TF:efLS{\INi[eY9Td9]pے!-:yw!) 
Ȭka2]QۏZgrj]"pЍ=WΕWnoa?`ƺ3ϚA3Ϫ8̮>^B.qn?+E{q*3ؙrwyYg]&o~"PGvDRb#8}{Q+XjֳQɗTA|RKKAxY /Efώ:|3GZ+](H0o ݻCgMׁ涝-Űun{Kc{lIw .OlAٝZtzfz1N:{D/*mE<(j{ivxAxtIry<=^VzM7JYl3FHJn-M`H98m,6"iQy%9z_ڐJǍ \ȬAѣIsf\4:U%&WcҾm>ڒb a*ǹ iKkF%HvBi&7#ɊsڭrC))2FicW9w!jłUrO)E ]&Ӯ`bɶ)-#{έƹ q(zr a 6l}b7nv<:&XVƳ J^@|Y%ChLjUa"+Ec`#hKXgّ//eB.]dGD,*L2Q@TJQy,,P.i{AN?vqBGr(V bXUfKd-ZIRC\=Ԫדz=JLHlA±(s :)Jۤ F.3D{QEpT$w5D"7CUEK(q> !X&TpfHs=(]L hUĭBل=ճm%/胩]m|P#ԤL% rFSˌ14\JQ,ig*"az 0Rj+d_\I9Ťlџm.3EMI36@8z߸G+; (\vV݄|~2v"xphT01#4R 8(k@c c DGBXX3W^#iqD9}`~YJLM$ m%Rcb lˊ묲V bwژE ?aa2ж9C[͚>?,3e X=|$Yst9lV2v/%6Yΐh?jb@(o\tj/ApG=ËH3e-P K:rÑ+H gj_aen6_\5uuLզMe6eMQFIvfvkuS/m&e",*Hf cV ZɌ :MK(Kn~d ,E,95EmT*8W۲ri'dK)1 SW}{V> ".7ߞgwj{x~o |gm5Bb`+"sy]>ÕJy)vpT{Xf}0a/s{=11xmi X 3g \!- X'4Vh̰ Gh*$^MwVZ\% 40b0RLLI`TT#pum?\kpjS)B3fgDU,3&?֫smɌw< jx#l +'bKʕR$xB8Ry4hQĢ@=B,E%Tp쉧OfCO=[g<6(X0%cqpE< bBK.X|aLI<`{iyP]w~^F➂ف#U4m!$r^om(ufE!yd,UQ0W}kN,zّE_-ֻŖ5[WhCq@e)P-A4@ǿȖ6Hozti yU(UKB\ͩc@hTg14tF̓ \_or2>j/_r|0aTi,[@эp`D-qTe02^M Ԏ:Qe1L/#zAm^h^"`OưM[[Tfuwt7$g|lrlk[][eJeG;5"Zؐv&_-=g?>XG=U#e 2o].wAWU7[&n?⿖e"/6o߱K/Y n 2' UmӲqfzx#fOOyt9&l7 Z^~1'[qΧVMY=t_yҌ_jiŲf#6fWb;g98{|:{ә*r[ZPc/ `NExlATPZBDv,Zȼh>)Li%صRy#-x`;0t/a-â:p`CTY3 Eީp͍ȖS=uza* %<&Пԟ>)iղOI/ٶ -Մ2ObC00 Vq$&TkxT QԄb\xH@pZUdA\qV ՜vEsi@H#O1ݧ\Ft -f.Ȑvȶ<޷b.e=8b&20=\`%*byRєXt ӣ, \}Tea V;g+x#PmTaJJK.m4ס(jy$O|:ƒULR3F6c@"x, * /},Pӯ|tCYjc&ZK<+40+8x,ْZ0L!]Bxګ W_MZs`߳*w#1wǜAcnX+ss_aTE_%~|ۆ;(п㞿ɗg\oO?a_n|{eF4kY ELȪ"zêuVmRw|UN羽 WPepK>,8mZ[׶;.y5,oapmeVܰ`~gw֚?_K/ Z^ͮz2^}CsX)5DYb}j=׳({6uP?Bt^)LyK5G9e |:eO|FB^H_O9cFP]OV|)JE>\ g+~~A0?|μAn PX.撄^U2=yM? kSyKBXF-r\ oޝ2 ##e.M,zN>;n*^+kh(i̝LT'7ٖހp9MEo@t]o@Qo`(\6+u*th:]!J{:B*a+m=td3t(JX!$KLn:thEȈRՖB$CWW%Bx eo %]iBNjG\BW"=]!]µT 6,cd"a]+@iuJ+,T:5|Ę-yW;Z ^fk{&0G@M~ &@l<9it9Qr7~>񹒚X\p&<H4|}hu&9 %"HX 'jtW5p>Ƭ),(_aj/箲Pgg.mXO{f_cw~mV3{\f 1{v]fs-hژ2V*Zg?[cqզ춗OlVrcjRc=ժɁSgzm{!&1 q*k#M'w~!kUSR%Q` 8fhՁ6C; x(UY2Bd!] 
r=Q'O;$0ʷVҚWE}k$%0^HJS֗RbwlҖekV:Wk)VWx<˽;kρt54;z{Jim-q6wZ&Mpn.WF6[%ؕxγ5;޶>Tvؙ}Rr>W:k)tiǫ5X޻ΰ唭 {dy"WR.311ᇄ,9|nf3-#8#Ȅ +˓+9D[+D:(9J0n_ԏ\LH+M*thu+ ҕdҔA1ſw%=]!])NLL$CWWT *uB?ut$6)LK=Tf(pm%OM 7$c "ZyBLtute%%DW׶ dAD;Bw5EO/t .;pfh? ]5Cٵr}]鞮v=zUBjQ ]!CkR螮(A3}j ؁_~[fS%<8Q>M,zײ"Nv8uRG@%^YCCIs,wgU$v`&4!k% Λ9kE"z]!`WkS+D{4Pjҕ$]`vpu2thپ<%c=]!]Iň Ne8$]+DLOWGHWJU%5ӡ+̵NMFww(J+CM hJ1(`+)e6LT ry PJz:BZR$< ]!Z%NWRtz9%hi[l(5J3cteЕjWSZ"ksjω8[㵛*ER)Ҭ_F$St+ϘUgZ>>3̈́lT/T 4,juM@KǰHq۞% 8۲.5XrP'oޒ{KiM=xepqu;?jU'Jz6J;^!$IVOE.}w~>ɠ4R]+z 2+r1R+C-塻+C%]Dw%U:IH 1C8`lQJlL[^s0 Zc։6Y'$5h AB*zۆ Ү h!Z@&U[Ri *)j5&5BEYPN%%ŜSZ \ 0k?J >-@֤>e~ʬ ~W#CMátp4)m{O|ܶ`V̗Ͽh9{ZY,qZ: -klT sG `LOيf+U+zzVBV @V aj ۡIQ +,aQx[w΢qW\bqW@-%G؆J6_bj6 \F㮘 cqWZ2Tꄮ^`]D  TJKtW0UD,h4ʐ+y,PT뽨'w(JI9]s ȕ<tԪP{Wz9JQ]E<'\ޕVPqrW/]iJ"!Xs2jbqWZCwW >wErynQN'-sMzVfR?!+CZ<}D$O]Ǜ-K.}&oj7AUNܕTkeFЀJޓ k%3|V˟1Ez—֩}P|<Iʱx ޣ+ź!7RSB,HX4aaK$go<*"NJ(#ĴOnl<r7JӋtWZhbrW@Ӗ&WhܕOnWIŸ++߈' ܛ:޼aV\puz^[ /9hpǓWZl+~߆Ď7]|q4+,sB>8*f ^\ ^὏M7h' oY̏@rsx# 0rdMŅ;/d<ϊjK%WDgvΫȗ5aS7mq9TI~uV-HH}^^JGշ٤0_fw"Y;ߗf cn*ut`Q_ IxBZ"|[ _?1c XÀ:fy7_f4Lr8Ku.G#xBB.R wJ)lU=fq͈RHՈcBnX,j|~>ukצ4Ffr~;ZVa63Rk<0_w\uԹX|~y̰j}A!d9Le7\D7 BT4_UORd 71 xb OCOY~QN݄xk$B v.5ׁM~_zY#9;M-Tr$=Fzj_^.oqQ#T%@^ d揹YMQ&`{1Gp XMHc⋌qeHFbm[wv@U C,7ZGM?E2Uj1kJ2Jbn*)o=Lq?x=kPLE?ǗYڃxGD]J*m8<ڑHP]I0LfD҉CK&B9>iFݗW4gc3b0n$GC'^rR#TB\v8iIsr6MzUCwCܱϳ|ل`-RWlF1eczWd %a"va)|(&7ij/o`>Jbpur@dضn3V 9F5CJ _ED*'<0t7FO5&lF_:>i㡹P' EHp|jxT| Iė:BFf9WTK]@YT9oV7/[R$֋a+X cL-S$Qq-MMuK]и{~%h *>_mb%|eȳ! 
XHBêS:-8fB9fؑ/ljf?tߺrLOu{3!p洟woz}0Oy_o.>˦?Z(|O>Ai(.j\5 @pLT4MRQ55mDۈ\IngjY"tKAJZ7-.Y6մ.I$LjWB ,h+JZNwǷXV[dcRlglA))ɖ ^ò7C*A}mN"\DS ؏1vMO)XGHl*%YelL8IBX9u\|5e>uSj!͒uhEYӽn#cs\21FPcP%dEᒟUyMm*p1@pKɴXt4~c h:&v|`^K{Iw\^*Rded'dXvY;O5lgUWeR8,qh(ƉY}J1¼@}/E| fu~ƺ[qqP4f%y&M7?4!/12Jc^*aď~ڋeXd)NUqL6%Ip8f4Z4YHGIΏEfs:xIT8F4%dR(Cil+B(1t)9pC,'3"|h.:čtf,i.J8'T"1R͇8`Y@M}ְ1Bjq%+PQJVW8aϜD>cwGe":UьpwOȌ%˾uL]sĂ1⤛!cqNm&"h_w-^0MK*1&ex#IѠ/j(g~j;❷fuL8PxCޅII9L#V@LVA l񹑬 +-?IDJ+f?XG%Wḓs]?zq݅8&?wr*+c~gT/)+R9wûB;xBBiЈb) {܏ʣcB,uL&2m+\fM5e|]JVާF}LPvzt]D4u kմDXSE](yHU-a%,@b]k CxrFnL!AټX/p3f}1ߟp1юŬUߎgGf2Nޗfre,VӃzYJ a18x>g׏L(G0_GaADs ߃7ף w25 0Oqn : l\hWh D!N0{ )lUG!adecɭKaV$ҏ1΃ 8sLT4BIYT/¸G T9]gŇ`_|nDo׷*fř%rzo0m;o8⿁aINM+ފ`&9Rtn3r3Q _HT1%KcmwmJ{vx,p3μ38)1qڷXJί_Jc"2E3vBVXůUEnJw~A+!3x .l[rEFl<[6P⤌gCz;ߊ(%B/n;G~B4s #3bH)DC8Y8SpgP,V{YM<ڒv)vԳpy%]nʋ;l?-N2.km%do 洒rc/- $EDiCb~1(4-;lt ByַDnx/Wna$YtՎeJێGӳuUxqX}B= iB8;,B:\Z[Ctm/մڧ⭶lm˞9DkM/ڱַ"!Z߄|A9Y ϹVKY[2&:e\zb kCކ \ffTMӗAZ1WLl2'`<dR|7"t΢ݎFZ#3w@<0YF{; X Yk1pv(\'}.[z>ww RK]*+Lfi oefWHAP_~`i[~g~ 0K ^)Syڞ~Q bۖ:Jjcв; l@=E vӪO,ZYJSbwK!)nXY{KGFAI\bMa [(<MY(7}8g (#By^M^E[Ap ~Oka6LX:n RiGK5RXJ P{ >*)10v0`u\}͘LUAqYRUA0d%4QNLP.NvNG,WMAGaP#GH񔳼?iL,TIS^+Ē(zY!\IlK :5Z0Y5H*r,Qhs9%U #~b"]0Bd07(;,o~iRL:AUD}7)B Լ06m|>QJy^ 1*~ڔ!-rsÄIfaT·?YcBFkvCF‡pRoM;psg6h2x۱ܡ~K 篟./EQ_([uџ}.a緻xr_fL=0 uR4 H c6 ֆj7S~+4yqN@B'z,SB)d!  $T&df€q"jW;`)%+p̋l5ݰrrsIJ PXvWI'OIYR;7YϔEWhUYSϹ)!{eA*0R@QbfRqRv0Rz3N #= 11tܙ4ΤFN5LL!@d9>l} y+WQ”njO)d806sYLZ% jmư5w{5̰SbsbZ<iK;u\, + Lb?@Jg"iJQrХ|פ3N)ؚC5u`Qr5P ._6:ՒaOc zMз>R]ǹ節^ǘW>az 0ġ̭Hc:\X3Mkrכu 1|jlF)u//XR\+_YdL?N&?ڈ"F4aU-j2g4ZCNK-جIgpbQ'>A:,"_O?lu''-xbl4-3@m,zTBu.*١WWG(g}yE^^nl8sEGla+UQzޞZL'N $@f("s&q ESuO|)Գ! ]6quUоۡs\!e[ڸr!K5vJ`uDʆ9-1z2K l~*:b32),YH ʹ 9v&mn[M&ֺ "O'TY Pٗ`-@q cem cmX^jx̷['x1ֱˌ_]}&tp0$8 0MGtm?KNx0H IDOؚJ'8=6mg^;k3ŵn&/Rℾ,EL9;%Eei:9MZi .3Sb0bjw/?ɫ5R@N^uRr@G1'i/B6yq $Ox>v [6qS`JݫЕ._Ϻyd^Wᯯ|!pǎŪRRq٠᳧~%^qK0uN:1:渒_\VDAN 3YK ÀAbƩ9jz\KaTSZWw&"HJ؟K*("qR0 J}Wf$I[_.5ֱsm>6+zĠ2ڿHY~Ʒ^ͷ_]RB{NsIc/VU/qSW(R+=u}M:Q/T}tDMJFM5)qNC +3" (8~hR9J={{ I! 
&a&AYR̲Z$bzuh+wGC#C_ )=/7`wW$?o1zLҡǮ[E^ϻmS;~cfwwbNL%㴴sȅ[ s^$se{AZڗl6*`m8qȋ}.xjJ[&P>;ۙO:3^*ql4JIX`1RmB9P-%!#Aݧ'p9Qˬ`>թY{- (1{{e/a8U' %V*E(WٯBȊZWYFN:96;L~owef,W˰xI:ǹJw*Ӕ4+t. U侖 8p+0ZR %::νדjK$;bw8MVc"V1{hmOf?&qVZ%T ![A& -¢vVaui#|}~g-!+ ~ "9 ;>1XԾ )GC0DodQ%\uk rG 1.`^_+{2nK'U|TX(({ 5V;AS:\f >R B*˙T)ŏÉ{La ,?zҥMyT)*]U:!d_"t+_.[  i|1 ͔r}$m:u! r"3q$>ԙ-',sҮ߻9ꫤ RZFYh),tBG{ ɤfƥ8Tw%=N̏lfP9 !+D.zp څr_fp|#hGv9Ƃ=.xeXPӅ2siALJ 5K_㱩zWJr*UŇCPFZ~cd-TmU#8!6(`owu JTr0S{ ؑddCgz4v2 &|s=Sރưİb60yYXBBv|kY?yq!y(bD9uT&@[kZߕ7x@4mM`#Jqٽ6IbJԏo!Re@Uz0s 8OUO!ITidu q"+!t `uDŭ 9ޭ_mJQx,{1Be@fZ F`-fSÔFjjMSƢV>?5/>s}`aRacZHw f& HYXw2+ym5,=pX36H r[„e[Ȳ*0zE&}X܊RY*R[wk 2~Jb[꣮9a}3s zb rz?Eտe4ք$D,uK):I{h$M* H"u:iwIv'eTZӧ!/(yD8PnS/;[  Hޕ6k;;\@۠1< %q؎C9lK6e#htxH}\X;$GW֑^ّ =$A>cSFsGJ8;8u™h"ǴPIb %dُCʴQԛ׵mI&X,u$NPH,!XR?O Зȟ,E>B_aT,K4qR>9{F˘jHp cgNGMj!WVT8օ8v8h-cX@^vFhB`ANJ0V\葃zeZk"us cR_D=')c])St ̄]EPM)QHIZsn-pP rqHEᎯP곙:7{]!T %\Rd_eEqc>Ij!OSK'(FUl.KJ}\ wePh˝(.Eq!i+'a6-o &aD <@'v _t*nf9VYd̵A 싣S "gB^  qlT^'6zO/p!kqm\dSӺ%1@G$"X]/GvNꂲ |EAC?jEih0V&7uHw:f+Ac*0A,5,0!:zaX1|:\._a_-#WhUz5Β,շfn JA@20 E>VOqMӥ0r~P:˱zܙsQS&Tdgэh`{5p>w+-:8D]vI֣R._ǎp*UjgLX rXNacw:Iق hQ^۶p\ s{ 'l@}qɬk.[)?f\Ηl~`vZhȨ`dr9y;X_`U_]`d@a f.&pxO+8D,_VrgJp:B2~40ko>'*$wx۽GBſ6+^D%[d\SPRcїB?A'U/-ʉ_iy.ɕԖ3(ϑUpTpQ9Ywe3@!Ѽ/G$[؟ljdD&z ftoBIEɉlu+bW!g˽iz#gd}J>vcI=s,3Ypw?[d :TVuUL tql8nqu + AE|8h:Db\S_s $R' aVh}6 ݚ EMkʹj,ʜrV[N/2I?@, . 
X?ݏE0]^P:Otk'BRI`Wu `6oEw0y"趨a^kA-2+3Z{(:8>KJryM)ʠ5 Zh`콸t/W)Fh,*N ^Uv( DSt0*{5Q ٥>̭i-pcѴ 1fw*ifgwsj0DUp/[ش"cbZ>-:Ŀ\ȸN0Mf=|hbQ' ؍{a=Xp\tcgLhS1ro!s-Fpwh:lN!xEvMu>tI.m0v1_dc ݪ":P"sY@}C%JIY~TM*w"Ȅ.)[6nS\NQ ^uD&-Nu֞/uyoIL72GVaL]dڪ?U[3?(Fgco4$;_~/ ϼSMؕ' LջmK9/n_72U]e?ךۯM/TT"w/o3} Qzߍ=wۿ$ `kZל߫t\ۯLfn>n4ݘ,ooVw&,Fq {7lUTzW,XZw\HG #; ;V`-`ssW&(3>F'`#T#jMj<͓4"cL;|o^+ゅ_+<,@M۩*EwBf5ؐYu= pwӹI{M^O.MzϑVUi^^^A2KyᓕΘ`+N\ց6%yP$@MWCe'kY??\w_uki58K=ǁR[{j 0qE86y&Ƶۥ+Ճ8*мҤ~Jf'ª"sG]I\e( G=V/;BG}6; i茒; CM~: b`OBB&>m~H~( _}T|>enOڡtxu{ ku(n I`tb`u"|O Hp"iy tHH@ަFQNt'x^z.c nvA̺)ʾ=~)xHm!!J]|;fG_]Sllu:WEțm1NIԕㄠ\g#i;>܏VYRP BM.cdu\bPQ<"ߏ|Z^MwB'RQNcG PE5QPPc&(4/Oy;Sn˙Ea,jr4O,YVԟ9^zAxkt!.R hSZNҲu8 {x¾BC9/085'٬w Ew s  H~s׹+CZb 9alaRLzdDM ۓ5MC-Bo'+.#07:"khJ^|$1Ć=D I?w7'yRDh{SP_O}!yy W|u'O[:SԔxоi&7%E-2xh 5Uِk~W-o+-<>r=K",8;$ }R$Q'YBQڹfy6'bRTϟtADVH2Ő@$`L&*R8/piV:{ZY=˼,oԴwK)TD\tKHVX"ȉB " .RAA#n)*lZyGe]JǤzzKq*I=B%($C,%EIR(پ1 PV釺5 9.ѱ)'w, hlUվs]A*@`@7S.Ӕ[V/sqvA^hNU΄!2cʩ!0ڠKrNs+_~]${lⲀLjPz |.b}ȇ1Ǖ޽ˡ ;B¦XJM*`弲Vv΋M)($L9j_` Z*i. ^n}؃d8J{9}ܡtZd R0zz ;7EPBd$p>sNfg࡝}KdJ}W䎓M븋KneaSdHV ɄbF _D0vgAb*.º퇘V^(: {5Qb0TdqQ00{2oPYVIyu<]'`< 4rܾ&R!PU (DKww]3B+mVOu}&- 9N6mHE,hC5IHqViTn$0FޭiJ-N*g"Yuq~)Փ]u¯EuB 8IQ._c[  XvmDcC2@ E\(InE'EƨF R#.F5r'=J裏Ry}iEؽ&Mk<Wfg`;߿7e2u',Mf7&=ޱ7_&,ooҧϙz0g6_Ϧ$+hoY^H.IڲNkm\L]6co~;?X̓,iuB8n8'%yHȥae֯.E ~R5^YXX'#,bmh-iE~BO{~jRBp3VK?^ypY+v撐^`0?Q)㊣勹Ҙ@0"P6(,E CIsBzD'>KBBpcK1䁆)}&³;jQ}oYT!WB@~Okq#3l=zzI |f!]n V橴*݋˸ AClJYPH2cAZ&,2_ս}(Pjo%۵yui#n]?WTzB0ߠY'j<{V^ -'YfXK,kixjXm)/VWj9A̐Z>k ۧWqBXSL:<)P㡝=47cM{€{p)a_wǷ҃Vٽq:iv%{>P]+JCPeu6!˫)rһn^6:뇸#Lpg!GEOk/#KKخ܋w`['~{LYqBt_wJMp\s^r9># ]R1cIY_ca1Ų =ko7e`v$~0opYOlJ#ό߯8Ϟ7{٣DH,b=ww|K6W|E,/~[ m߹olOd&MD6KXu`7kٻKRm >^W1{GОPy]_uM5N1zgѬ-dv^׀8T)Sr@ŏ[xLL}5uAk 8nW{[rS9 ?QL%g!qH|__? |G8۔jIo tOXS|׽w6yaCIji m5 @[49Tq}+U&&[7J 1WNa4\c5Hzg.N*M! 
\rp̟DP߲$kÑ7i`coc hZꀐӺRRI~`{cTn X PF$3 ɶ>Qi?9V+eGϊ<p_ (7 TTFopx;&s>PB, +N{3<Ԣ _&my|~XDRvӊ:U͢;vFJuWk~% EKRJyկIJ4~K&HTZC_-!ho~U(>6P6D dpZat e C"ێa,hs J/m.M.|d1욋\gW Gl<PIM̩#j6G9D(eZ3# =0nGM@9WB!9Rc{JhNԜ*2ad1alMl<}a$SP:WRf.)QybҀv}0Q[rfD~y(>)qe;d  l}/η9sx}STSO';'&QA==H8Ì+sis1-,יztcw7Nq-EG{]6rtZ{PZJϱV.2 cCVqpXRDU.>6<zIp2uS3`~mwftj+:sB@*C\ .$) "nc;*Q/™e2}~|Wm8 A0MYҴ/t8cV׽~tW\ޔw`דA?/"{? v[H[-~$7YYIqK;e24ɄT(&2O9&y~ ؼ%rVPiP>^fp^S@eVOzl6S]Bf}_^1Y3JSg.?<>gEQTޕ$k)RUH 2FdEy7Jy7v+3Z!98CdA9| oa8+Seo)cQyXN@c,\lDONuwy(mVI 6d.4*;jo4#?.׻2/ԡ#:v>[5nzpT"LXk2M@ ?-~4ɫgTyB 3k:~ǫxP$0kbx/ˀzzS0EU`k; zmo;kL\[?ɷg3.{z?ʴhS93+Gq2ye/x1"+m=MLG!#w{31{OΧ&H42BxևRe4u>fxFUjIIKBVxydT(^P(޿vв {sURkl\[d< ibCbR pA֬|3c1nf. ݰoq?ߑ:&xV+w2&߰b.AQBQ{"dU^ч֕|>r I0j-EΆ[!P Lϊ[!v{`^h9~@<9=rsi݌ `˒2 &IBq iWVV}ĵo[jUׂ9r vdw/߬j69N~=/H1r`.$FSoRWP-oU}߽?Qogkzc-,9{HJ٥{sjK)fz699~5\Edm}3_ OkN֦5˸ xJ&$3V[.[=*rKϿ\3ε6R'^kkҊa ^ۋ)!_+f7P \FzqFwz?~4~f)S8su၏?Vԥ; ?]QYK.vq QV2HDg> SAy0<\'z Ls"P06x>0=-7ae}<'0{Rl>d:I/&yRb^y4c%]mZBC]dSBܖ_gжe }JaHUe`@7oQӛ{=>)ya?9K{sr`z_ӥ3M}msq8sf7*B@1LƧ])8Q[]qq,yG /hoU[O.̩ޞT\y(ߚ̵ٿ =p~ @CI7Sfb=\n l `צD]Zq 2!k:0$jÀަ:8~ 7jYGOu}Iljt].~_+s|<*zid|`O `5oEC`埗6Z_G\0dž$;e}Y)8sf͆}?U3oF7Ԁ{{ڪ9>XY_#/{`mٞd q}, xG.*_F N=QzxyV6)Y bF2yq:?9m}IS. 
5C$+z<ъT|4hAY8Xn[7[T=ķ@YUn[sB;/ Pڍqʇ?)7[W& w`hdKf}E!n To+UznƲO3 }X UI 0tꠐ/xj2-V^Tڻ])ҬPoEzK~,962"fhtJ`VlGaG.&Qw{\k÷_6N!P]_uzG~?9?y|C`\5Į۝(|.WzCy oQD)2ơhK%=>5n:Uޥ^I3J-r'Rͬ++goQ`A*U\5q(-TI3 svŵt\qާkG0bv\g mkW\gTݴORx~s#1Llq ,քrZ'"!["X)߶rJ;~9 P"`7A} G&i:0.}2P;&pIf9NDAӱl $֟l_{{F ӿա?73djUALdeUXKRm$CC IZu 4,)/kyz5&DJ*;I\ҾWB\"F&CLDIZ-Q&L g6z'ZJVx$Y2.uk4sg*9qq<>b՟_>NCi>ܝ>P4WȈ|@H<ߛ Lu?~׍d8:^LvȶOȭߟLNǟ9$>qE/r{?և">MhKS8d& ?AsQIzX)1*9=#ϻdUNVdȳk?.T)\|7 Mұfd)ipH: ][%cUQTcȒ RsVXdmu^ BIJGIszwҨe;FbD !lE+ehJЬ%m^b,eXP7r댴҃5O#I\DZ2倥)^i;pry|xR1?2>6vv)rhNUW@q]`)@;!i;)p)e޵q$_(9o!Os}X#lIMj5I)ruWSCL"3OHp&%FL.^s%1EK&.%lhL AnXUx;6-jl NG50xBޗa`y:ۈ`&b2hxիU!V 'y~D͠#{?OιgD~'֘--wm=)ܨY`WE|})8b10~ʻu;e1T #)J,]2$oN[HYRT _`;b@2YyK<+9PxɤZ, SNɎ +ht~s9b0^JH{^ R p,,ʽE` Ůoہ'IuOKz1;+̣(|u)Rk -H+]H,29+ЌV屦Do93u8k}M#Aс B$^[JC#K4 |-y 1G9՘l^hY.yؚq/ oDx/ȗkwPظϓ#.N:b; 19\fߺU%9g]*{I{%f׈. vsF4F3q1<7˔r.N'ZNoV@qa*p`V^:QI,FxS[r: .[5To*.ց5w'--MK ko5w/9oEUQV8=Dt_?73Q;c3kg̢ M,rFoz+ &<ސQ. u$[Gi`1imp!*$Rof| V߲R$`Kr}TR93FaiFai0pWXLd`Ɓ4Ú8sMO[ˉ,U6hfHrrKμx6Ŕ:*8y!0U*nm&Hhvl30.KFVG%MуwDJJ4+DgSvِu1ysԘYGSV3Ar`YS|bUHxef/Ǐ7MY:u4hٯ]X0ƻ?U3FSTSn\!&,߫+κZBF/d*Qsþ)KpR0!X-w˶q3eUL?.wyKk`nHOX8:;ƣTe^5l^MuiTǚFuiTǚcʧ^XOjZ4Y֞N@2f㔌SдMCgQZx &"dvoնmJ"C?09$LS|gZa J3rcGUJŃ܍ؒU*m<{RM=d Tw οy6|z.r"ӡ[gȕUVqq` z.Ed>RLGUSYaQza>ҏ@zNgO~:OJsV@x*Sw0[n+Z}ܚ&Rz;OuP><&*xb 0O@w+O]yaN횚`Cj*эآ{eZJ[ȝO41ʤ1TcWrSLctȩ≍uSQӕ> JN 9VJNa~j`1] 5Bf%+9݃o}Ԏy}DyEtOIF܎+{ԎeVuJA+{U?O!}lV0hJPwz`=`P+A݋oWݸn,R/n^-o/KDze'7ԺWo[}\u!7ތʽr]/K߿xlH`ԇ;>{]}HW<1՟y..i45w~X޾~fJ1$XمECkӝd$ծ*3W9(cR9Otm!J^UǠOU*?_[y#|=n?ZS{q|3`= zxN(Sg0ax^olŗxwk䗟SnѺ̹)Y=[nYꙫd)V{ࠋjanܳ0y֐#T-~K3>:L֐{ΐ=:`i ͇,4*:(b،kox&SCX;T<47+{R9bօycoOհ'~ƽ`N֐/$ ˢE'xˎw99բã(k%1U4t\3Q]ѴeSFm5,MJWU_c"e-jϊN\_[I3xׅhtL 4w.yj y/x2u}sWq=72{h >hWCr}ػ7Yvo8t3߽SdDFM1B`Rl)aGkIP븷QOlz;5‚6Ny옻ճv{H)l{ZPUŌn>UMM g¡NBvEzw6Թ/YPZJ F^0=-fqje1pm ]!(-Д1SM 5&h]9wzMtNܺA3XQ ]n&dasL|p}ںAlCů[Jq[.8f[85~r,1ǎ֣T;{|yz÷!f}u4tc:GFC_3|?] 
ɘ:X,w.2oRG3{2zy6Rzl&fK$.Zn=  eiӟ"Izͭ2Ӿ(wǔ& ,e=WJu^c٨f[[[g8dրQ%hqЖ L%W.ݟP sψl}0vj݉&&7K;>#;#bG!3 13$w+Y%]|Jxm$5Imq̕eXWqwd9#>f.S5@GK_;:h`XvZq#QytCUie@lӟ]+XSYA2l5Wj͕ ź? [{{^}Rn'3f'2ᜓrNGnX,&˺}">>xc(Wt=3|o'}weaEu2BzC$OIB{#K߶Y\!)b[mo9[@Q'PrC5AkLZF,1A8ukocۜ3Ee؍1]B.t^M<[_o]ϪBJӿx-ƈ`?ުϝ7'pM;5Tb5S@'SmƓ!JvmpP(F6j :w`=`9{xMMO`KV\63 q4=m/ K.1`gcM%}}+ 1D?QwsQԍBqx \`3+1ϰaS2)e: lLdjE+dk{B=M(zw rKSUr5lt7!Rlm[aLG)>ZL5K K#W>s |d$ę; Y)\S4V)Ryfg)v1[QGr,\ 0 j%`n=6,`.gGeD͠O:t$0QuϢtFrݙd6FM}꾔)'щ9Bp7wdXgga4RʧmFD^N/xTDbq'Ѹ8bt1A0ʠkFYb1u7)>:kkpmvi|)'o_+ ݸ7A+3 X"Vқ:_=sRrߝ{7b8R_d@"V~4gx?#>ow筥][a=t@&J7u|kBޘs$(Ny#zF8f&"0>vigIta%⃀2N*t%KdInS}4wMn^_7j,Q_osG0㛤wM[r6VsQ)nΛx/4;xZߍC-Mf#i]&QXLwBtMX=Ͽs#?j4^Fߡɮ(Xldwj=DLq2!G/{txsNQgg?9E6 y:}!*9=dTIvFY%ջjԥZ`"ެF*H~I.)&=#9bG䀾(?:. K$gSJwS:'P=ɒɨmS&*ru2QV#FԽؗWa ch*𻫢flwyO8#p8깶.mu$׀'13$Yr#'!6)ѥɷ\%(7ɇmĽyyf܇ߝ;$D;"@I-DYsf7 3%הּ 3#qޕ~< F(Sݕcfq܇] +G7Vj9GqHm^MŎjMȉ9UxG7sӞ EϬ%5|O{V!/,T ڔ'QG raыQQB)G>s_PAz[[8:,8q)ַi](UQu$sS:ϏSݪ;K@;vK-576j΁.!8 [݄~!vv9 ~TGdpg pd M18OHpnb@)vi!4 k-f3*da}fms-9qqՄQ, +*U{ q guC?I+ tNDVB,VgCKuϠ0x\)`p¹dbXـ$:WRPߋ,n`mJ`b$M[XM̎9a˙aZM}:WɃL=Ƥ4pӸ:$lO]RSԴAěs~"~(1^z9m(U&b}M´~wC1XrR^V5`ͨk6+)gL=Rް9#}c7,=MxnDZRVڇۿ01v!aO_o#rDu `Ɇj&OG58Bգ+:8[RUuDHu-؋n  w7<;CFyөr5!oIpg;m`&)-3Trs2qS=J(_1Bq@-b.%^KW%os6@iJcvJ` ˮf)F)OkUibV!q5CS ff7cl\E> RH$rl&J{%[-bG!Ѫ p]}"+ 9zP K._'ZnO 0\\qAߏoY@qj(I%1%V/8:_Fp"{-p202c@n^y<;Zه)C⼨J;x_Qu͹FJ*,!MgVbGc!@G%d'bKǏ-}5QcdϷJ.(]Q^MpbUbb$}R,Z FV.B YR HѤWYqp؎ w їa}]iQ/d=bS2*WNţM6eMțsC3RxǓzEښ̨fF5YmmQ[ln&~ _bSo).<Ih>L\m_.HsXalrtᩝ?=s7ŧOo ++-)ʲ.5ΔnUD%jGTbO- JU/smUuB/R?T;0HI黿DG jnVpXuB)j<;-W  Y#ShqT+pK>lɇݢHA?+oaCj+@1@YFb@'q5w!۲WU떇s-bZ҇-GrbA}MF=͹B덐C Ά[\!bv.Lޑu!oGnE9.[ G-s_^!Rk~3ov \ ;&GIhke2J̅|VriЂPHWw)"YWZq!VmX-\Bݲ + &T{2?|QB%,o6m1%)KVç\UUV8.H(AG0}xbM'ʲ[E:Btp+:N9*9ǯy[_=9;/Jʮ)IQgöO*EQݑw%.04S:uXRd69d7 KD"#|_gG<ȦHnV3PZXz?V5.D/A)([//!bH4bᄭN%L4Doz^L Џ=4k@!Ca.%]B԰N%B$TsP8#X#+t,;K-uP`0@~.ߤC6r37NW#0:G^qB`]|x:ֵ%7|Ub=N{N5NP⯫B[oiiZ~Az?,Ia3>ުZE&th&%&4@hPj8Ԑ  !;I)jZ]1J]FYByU7yAG,çjnM*}$o[s@ 1 :H!NMkUtP!UKP@Y .>(D)&6>‚h5=ědǯ=PuӏP T@RNI0?"0>}l ݸ) \<_S>Ä{{S 
+ϫ/7Rokn\*!fTZBnj9ĦxAMof޿Ԙ,6]&ɰ9Ht?0n0WXIrֺ_I>/Ɛ wĐo.y11d񽇻,dj8'|+{*aE Y~9?R{75*&Ĭĝ;-1)ua-E:'u-}=bI6QZejQ ɁIEDVm }Á oOؗSflRJe.y 3 HOF`^Y7UMYB 5! *džZj(JAȆص"epNz/;}>ߜ׋(ߒHڔ? .#[7_S*O!P ~ACOy,{o}q1BlBdS j؈2JV X L@NIؗ(R [(*FtS]#L DnKLxdLag*:ejjACI=R9pbKH2 $$n[[yA橚7o}ea\}ѭ7%ڗFYz/HwtaF~"Ħ}G%m-)l}m:7Ê/n޼b=\p`PHkZ cH C1ΏWw8YR"38p ccg PNps88^) |Fw+1BխI @X}/IMRvwlF s`@`YW70/vM*k&p>b~ؓ#Q}ĕF/Lj2pzvFZ WQh8| :p~_?^e/Lw_HIop| oIX~`NHlܤtn{fJ?;\O$عC+̥'Yz2\#εo323jdzU/F$&e"F b?xȠvOک:)sکTQX써v`Ͻ)v-q0=Ojj^7V#*; fw{3>6V=j{1||~w 3[Lz{:'c-Ew"` =sHՈȯ;]8vK9{4ܣ׫@T#ނ9TkB:&A^Mz|)~nR^>luT`&+ S.ͨAPj&(^#/XO0Qb 5քj(F7sC. q">OIZ|\)5%C %K BIMR#.H5:v_j+ Rr|>6b_3ZΜL*,y+ (V W0rĶx\%R tQnTY‘Y;YZqт 3o |5}p>_, Au e;"k}.5Cd+X yrm4ƫ>H%\թ'زJNi3f119 j\ LV-@;zmtA0jZY#`&^2aKNa:화={{ȅ{̨8Ifb\PW"f7XXɍ" A -#x5jGrF k$jWv'K"xjVrM`JU5*N2XiZ!y1'fUK7I`Lgh[N9.„3ɫijJ ͇QaoC}KbsIV52N9ZrRcVIX7{0ԀVT.Ҝ5q!WU@k2N^X[O110rD*G17֗/:Lq["2T_N1[>GPҙٟg IQԋ$F'' !;~Vof;ڪii1'(%ِlhjuȺ[`U!*}pT!%C-]5s~( L0w ~њ0}c 7V>TW %(\-0x3@i*}*j3a&X /8O)S[H9 yy* Rmk|򐡀{eۡPe6C]Bt=Yz)qA.IXW`n%O35sE>j{]bIdJ&{k& sP^tgPb>u6{M)c^T;5l?⨕~)cG 1cOG[G<|ZQ`]gHVB)FoR#G)uDb-#[b5%OO+~3&j? >"a:`u"l_BL%ٷ6#" Ym)d[d?W-nhKKF&0EcTLvqcōSP<2Z Dm\^=m5kw~El|-qZ V]Q)SnFM`b {t=q:eNF~Մןw-¸5RP/`_ lv0xҁ0H\ܵ_rVz^c .{kA.&d`g͚D@?S w:_]|sJ )p;3K^/BF6{eV[A.<5ghqdd?h;ab3՟=bj#vw}YO_wGӵF6Hjx*Nns?ݏ0^va. n`^)/?ôovۄȔ|0I׀bezWw3X; i]?O5 lJ6(?5HrUFS$%j_W`%*a3W}=fl}W?^U8B׋לk-<=y®ShpW0SW528EO߮Vzg5 a6$!7'akVϐ__\ĪPǓPqS.xmUCMPhb]S1 CT%k00ͩSEdr(\ zM9!Gqc}0Nuo)M*}$gz.439룐 Hl%"O{&җ|W/oE |ȃ>+}>,}[P.גVIIQ ))W~e-%{1tY)gSa{~Z#OG-f{A  d" !Lv?{۸d/w~TAƹ@&$7$(}%y ߷lڢdJC: 2c"UݧOUaApEgNZ?"U;(?xYQdJd䥡dG TC#QP tlPRX|}1yѭuttyc o Kx>%Abs E;[VN>h՚Zi&36G#ԛб\'Ƀze"eG9Zmnε}}+΍+~$rojd{nVӄ”H| +Nzd+j;ߵiigTVL<[=)Ec:[ #\mqAɖs/"}7L 36I>} s"REj]3):3JK41DZC|v{Ͳۺ}=}>iy Oj%_oJ B}-C< i^0\$Kgb ,p 2 y dx企wZRf2LgJC2Y^!ԙVd_hT+{^U`*)6y{E<mV=B˛VyeYb<:tL4K$AiE lXf`<sFq3]EG*Yǟb, K]EjԎա# sbk{$hddvGdKxXd?l AV$)ZGkY;(,#wPM ߎQױcu49\<6,a,Xs<}#bx-)) 8UsLش"k7 :&:%cbv yu؈ya|ZB 6Vzdf-!,C w`ΰTxaʻuRXM똨]{|$QEjԮ=. 
V NI.a*P,}lBʁ46|Q*yip%%yj=l[6-O ":ȗ 研m#OOyT 4T*+xрt)"˜ܖ>RB9t' Y@1u,<UnJFtdEK%(˝D,K0Wx)e\n4Z4S+%*Eܫ>t__560tkU}{8o̳O*ɉ Յc$m&ѐTXƫԘRz }FF'} G%f:M \њL9 D NKY*N) sʊ¥"5,ಃhQ(~RGF߱.AѪy:3yQi7% Q9tJN~L}[ax$,BCa)zTGei]-즛U~ee3w|^LN~XdGb"aªZGZo BN5F|b>VV##:0tF7a d֑Fƺl(AYg߼~ܴѾ꧗jO__mH'(Tny5ξ\iL'^V*9eQf+g}vy*{!?V1Pҽ4Ǣ^Av0+E})GE60ðm^/.Fr7ϏrųbrwJ_ozq/&y.F@F5gq͜|Rbᬡheh @j%혷rdK|KXv)J2AL[NcR/9?ɰB &#7so" CIW'O>'Ȩِ-Y2axC^կ1ٲ,ykpO0AHU;*~@fzk;S6{Og h2O];c߽7]5w`IHG6$ݻ{euXkJa/R 2q/k{w 9X[tlXыR мjEvKlǁUR]e,<#=^,0z{y'8fYv,G7ߞN:&ݥ tCR|f.wi{R[TJ"0^û~~*|t:ܞRњgu9};$(t;"3907zgͶ>q5LGEpyA>ӥ6[/{P[q:6VH:Mh-/8"'VzhsmԾř4k+}1NNz_o 2Bg+ OyBa?vPĞG'Xr_b)j'))[D­~{%7ÊCc 1Vb@&2E7|#:{Y1bDb'eb2 2˶ ӮNj ;7m".=i5xB!cCw5jсVZܡ]]+<*AD8/zjiAnHjίW29qB=,uau`vz+4'ᘇ^n֪M}NPwǷUdrq1cn?'pfO- g\%@. vY1=Sӌ F&bVIiw[עy*Y#?j#v0t[Iv5adb ̬=!x1ϔ@YzLh^_hT*8 OVy냪iO +D۬zf<}>lN)t5"(g1Pd-xeFֺlrYI '!L)<#GnUr nM=mި'XMӪ\("IFګ/I=VOQxYWtE- =,{3z-HgB\Zt΂61!84ޗ#6]fMuO߿:]xo*a|K7Λ9O^f_jU川NHaש݄4 }w&6dDiB I\Bm&vR%2o2~VqaX%WPf ,|nDuDfMY,C2sY UKM"2PXǨԇXjsBDkL ((RyiBxeYQj%\! 9eCQgmѦ35]~XUKmbƖ]л(kZӋzfc޿3W?4==9bpJɿɆ7/{w?= o6fK"&Ey5 mgow$Kkou /A7mn%8ދ2)12qֺ 1QXis{GK}4 tF3}bDD-\2&?;_q|6 ȪM.-mkjWz(/8 +6Dg?/qqiAW'bGj4eϟʤ'i_;t:n8Nk~7D/n f. Շs)dž{B X9ď&cPV 1Ȃ)<)2I^%H\eb6O Y{mS^cϗV*Z+QWQ[˜t JPllUpvRDMbOuTY]=:KHX ;m1+PA+uPZ^_@Hxp 8se&>\ВG990NJ%i)5q.'U՗gR< M=d*Ǥ Y?l?-V珕w$_Rf?rO>0;Yd~bއSg8v[ IfuvҗHW:? 
Feb 18 11:51:52 crc kubenswrapper[4880]: Trace[1789403628]: [10.177114576s] [10.177114576s] END
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.015690 4880 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.016376 4880 trace.go:236] Trace[957177706]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 11:51:41.510) (total time: 10506ms):
Feb 18 11:51:52 crc kubenswrapper[4880]: Trace[957177706]: ---"Objects listed" error: 10506ms (11:51:52.016)
Feb 18 11:51:52 crc kubenswrapper[4880]: Trace[957177706]: [10.506126677s] [10.506126677s] END
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.016391 4880 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.020440 4880 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.020785 4880 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.020916 4880 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.023319 4880 trace.go:236] Trace[1706148569]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 11:51:41.983) (total time: 10039ms):
Feb 18 11:51:52 crc kubenswrapper[4880]: Trace[1706148569]: ---"Objects listed" error: 10039ms (11:51:52.023)
Feb 18 11:51:52 crc kubenswrapper[4880]: Trace[1706148569]: [10.039374573s] [10.039374573s] END
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.023354 4880 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.025201 4880 trace.go:236] Trace[1516322096]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 11:51:41.004) (total time: 11020ms):
Feb 18 11:51:52 crc kubenswrapper[4880]: Trace[1516322096]: ---"Objects listed" error: 11020ms (11:51:52.024)
Feb 18 11:51:52 crc kubenswrapper[4880]: Trace[1516322096]: [11.020875934s] [11.020875934s] END
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.025228 4880 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.047720 4880 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54030->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.047829 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54030->192.168.126.11:17697: read: connection reset by peer"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.048286 4880 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.048343 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.116698 4880 apiserver.go:52] "Watching apiserver"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.120065 4880 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.120326 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.120884 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.120939 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.120992 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.121004 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.121034 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.121192 4880 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.121314 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.121454 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.121479 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.124011 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.124123 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.124192 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.125139 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.125204 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.125241 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.125342 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.125414 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.125415 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.131046 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline 
is 2025-11-06 17:54:37.74675199 +0000 UTC Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.150280 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.161000 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.171590 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.180484 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.190264 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.199727 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.210325 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.219668 4880 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.221580 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.221625 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.221647 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.221688 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.221710 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.221903 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.221909 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.221932 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.221955 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222001 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222019 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222034 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222226 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222357 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222585 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222662 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222680 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222702 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222706 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222719 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222764 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222784 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222812 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222835 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " 
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222859 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222884 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222907 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222929 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222932 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222951 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222972 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.222977 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223059 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223084 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223158 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223181 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223098 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223209 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223221 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223269 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223298 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223323 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223346 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223374 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223391 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223393 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223396 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223453 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223475 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223492 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223508 4880 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223521 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223524 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223536 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223588 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223618 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223599 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223660 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223694 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223699 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223739 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223754 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223769 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223810 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223841 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223843 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223857 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223870 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223908 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223982 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224552 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224632 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224672 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224699 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223930 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223941 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223959 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223974 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.223988 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224688 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224823 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224833 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224911 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224935 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225210 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224959 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225239 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225009 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225540 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225553 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224022 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.224877 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225656 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225773 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225817 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 18 11:51:52 crc kubenswrapper[4880]: 
I0218 11:51:52.225843 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225865 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225887 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225906 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225927 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225948 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") 
pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225966 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225987 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226015 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226035 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226247 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226272 4880 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226301 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226319 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226341 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226364 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226383 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226407 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226429 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226450 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226468 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226492 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226519 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226539 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226566 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226585 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.225763 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226306 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226730 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226921 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226951 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226970 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.226996 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227019 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227026 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227037 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227059 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227080 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227131 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227183 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227206 4880 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227227 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227291 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227314 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227337 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227358 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227479 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227505 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227525 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227556 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227577 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227598 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227641 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227663 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227722 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227742 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227761 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227899 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227550 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.227956 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228008 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228277 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228306 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228529 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228727 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228750 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228744 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.228781 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:51:52.728757294 +0000 UTC m=+20.157658145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228811 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228837 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228914 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228952 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.228977 4880 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.229002 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.229027 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.229307 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.229370 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.229502 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.229705 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.230073 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.230241 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.230303 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.230335 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.230464 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.230532 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.230711 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.231322 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.231710 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.231924 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.232002 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.232123 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.232410 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.232445 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.232561 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.232064 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.232691 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.232741 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.232748 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.232853 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233183 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233290 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233379 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233556 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233667 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233734 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233766 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233911 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233962 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233987 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234109 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234435 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234511 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.233953 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234696 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234651 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234776 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234812 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234842 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234756 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234876 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.234992 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235023 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235051 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235060 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235207 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235218 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235240 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235267 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235270 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235286 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235377 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235394 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235291 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235419 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235429 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235512 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235539 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235574 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235692 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235751 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235791 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236069 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236108 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236141 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236589 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" 
(UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236634 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235491 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236661 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235583 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235635 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235804 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235897 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.235774 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236535 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236556 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236581 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236758 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236842 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236954 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237381 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237417 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237418 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237436 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237471 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237546 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.236689 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237694 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237712 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237747 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237773 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237796 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237819 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237869 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237887 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237949 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237977 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237998 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238019 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238039 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238060 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238080 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238104 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238125 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238151 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238176 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238207 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238229 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " 
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238252 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238273 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238297 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238321 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238339 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238356 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238379 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238400 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238430 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238452 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238474 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238493 4880 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238515 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237696 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237701 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.237923 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238109 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238150 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238372 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238475 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238542 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238545 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.239165 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.239230 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.239703 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.239838 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.239404 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.239964 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.239442 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.239988 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.240014 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.240111 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.240504 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.240530 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.240560 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.240697 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.240731 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.240763 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.240860 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.240958 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.238865 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241006 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241028 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241028 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241098 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241147 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241152 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241234 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241289 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241305 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241320 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241333 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241356 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241379 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241412 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241435 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241461 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241491 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241518 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241540 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241564 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241580 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241626 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241658 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241735 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241743 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241764 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241955 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.241976 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242028 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242066 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242048 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242176 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242204 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242227 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242252 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242257 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242277 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242180 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242303 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242328 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242353 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242376 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242455 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node 
\"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242469 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242483 4880 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242495 4880 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242513 4880 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242524 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242536 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242549 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: 
I0218 11:51:52.242552 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242561 4880 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242596 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242626 4880 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242638 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242656 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242667 4880 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242681 4880 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242692 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242690 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242706 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242718 4880 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242729 4880 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242741 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242753 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242766 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242777 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath 
\"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242790 4880 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242801 4880 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242813 4880 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.242946 4880 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.242976 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243035 4880 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.243037 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:52.742990189 +0000 UTC m=+20.171891120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243068 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243108 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243120 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243131 4880 reconciler_common.go:293] "Volume detached for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243143 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243153 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243163 4880 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243172 4880 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243181 4880 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243190 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243198 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" 
DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243208 4880 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243219 4880 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243230 4880 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243239 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243250 4880 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243262 4880 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243276 4880 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243288 4880 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243302 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243315 4880 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243328 4880 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243341 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243351 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243360 4880 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243384 4880 reconciler_common.go:293] "Volume detached for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243394 4880 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243404 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243414 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243423 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243435 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243446 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243456 4880 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243466 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243474 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243484 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243571 4880 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243680 4880 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243586 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243741 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243751 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243760 4880 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243768 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243777 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243787 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243798 4880 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243809 4880 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243821 4880 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243831 4880 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243843 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243898 4880 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243911 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243908 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.243927 4880 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244038 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244041 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244053 4880 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244081 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244095 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244095 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244107 4880 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244126 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244138 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244150 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244161 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244172 4880 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244182 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244190 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244198 4880 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244207 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244216 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244225 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244235 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244243 4880 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244251 4880 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244259 4880 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244267 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244440 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244552 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244631 4880 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244641 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244649 4880 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244663 4880 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244672 4880 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244680 4880 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244688 4880 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244696 4880 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244706 4880 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244715 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244738 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244763 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244778 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244802 4880 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244812 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244829 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244889 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244909 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244983 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245188 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245394 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.245493 4880 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.245544 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:52.745528787 +0000 UTC m=+20.174429648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245560 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245664 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.244821 4880 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245758 4880 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245767 4880 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245777 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245788 4880 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245800 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245814 4880 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245824 4880 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245833 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245843 4880 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245853 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245861 4880 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245871 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245881 4880 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245891 4880 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245901 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245910 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245919 4880 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245927 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245936 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245945 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245956 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245883 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245923 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.245923 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.246079 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.246216 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.246229 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.246271 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.246898 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.246926 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247062 4880 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247079 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247088 4880 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247098 4880 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247107 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247116 4880 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247125 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247135 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247144 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247152 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247170 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247179 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247193 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247211 4880 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247220 4880 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247227 4880 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247237 4880 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247247 4880 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.247256 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.248550 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.253491 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.253632 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.253752 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.254029 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.254190 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.256102 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.257173 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.257200 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.257299 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.257378 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.260669 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.260693 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.260707 4880 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.262936 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:52.762904516 +0000 UTC m=+20.191805387 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.265018 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.267318 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.267970 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.268061 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.268149 4880 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.268270 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:52.76824378 +0000 UTC m=+20.197144641 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.268400 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.269859 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.271821 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.274050 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.281483 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.281874 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.283038 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.286749 4880 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e" exitCode=255 Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.287125 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e"} Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.307187 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.307957 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.311236 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.311955 4880 scope.go:117] "RemoveContainer" containerID="c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.314231 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.322839 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.336996 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.347924 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.347985 4880 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348038 4880 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348054 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348067 4880 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348073 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348082 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348128 4880 reconciler_common.go:293] "Volume detached for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348143 4880 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348159 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348171 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348181 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348192 4880 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348205 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348130 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348216 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348228 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348242 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348255 4880 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348265 4880 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348275 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348283 4880 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348324 4880 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348342 4880 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348352 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348361 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348370 4880 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348380 4880 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348389 4880 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348397 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348407 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348415 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348425 4880 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348433 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348441 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348449 4880 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node 
\"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348457 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348465 4880 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348474 4880 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348482 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348490 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348499 4880 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348507 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.348868 4880 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.359903 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.375984 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.409923 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.436140 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.443660 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.449143 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 11:51:52 crc kubenswrapper[4880]: W0218 11:51:52.452172 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b1b0ac30ac277ce0a15678b756ddd25998e32702f73407be10becbc79103bd10 WatchSource:0}: Error finding container b1b0ac30ac277ce0a15678b756ddd25998e32702f73407be10becbc79103bd10: Status 404 returned error can't find the container with id b1b0ac30ac277ce0a15678b756ddd25998e32702f73407be10becbc79103bd10 Feb 18 11:51:52 crc kubenswrapper[4880]: W0218 11:51:52.477853 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-7e582d3492b675c104692a334466ee9c200f436c5e29c52dba76c797bb1ab32d WatchSource:0}: Error finding container 7e582d3492b675c104692a334466ee9c200f436c5e29c52dba76c797bb1ab32d: Status 404 returned error can't find the container with id 7e582d3492b675c104692a334466ee9c200f436c5e29c52dba76c797bb1ab32d Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.677568 4880 csr.go:261] certificate signing request csr-hflzz is approved, waiting to be issued Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.714915 4880 csr.go:257] certificate signing request csr-hflzz is issued Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.751580 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.751668 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.751707 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.751787 4880 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.751832 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:53.751817396 +0000 UTC m=+21.180718257 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.752188 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:51:53.752175955 +0000 UTC m=+21.181076816 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.752266 4880 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.752302 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:53.752292588 +0000 UTC m=+21.181193449 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.852803 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.853038 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.853148 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.853162 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.853172 4880 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.853219 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:53.853206909 +0000 UTC m=+21.282107770 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.853236 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.853268 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.853281 4880 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:52 crc kubenswrapper[4880]: E0218 11:51:52.853342 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2026-02-18 11:51:53.853323032 +0000 UTC m=+21.282223983 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:52 crc kubenswrapper[4880]: I0218 11:51:52.999683 4880 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 18 11:51:52 crc kubenswrapper[4880]: W0218 11:51:52.999856 4880 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:52 crc kubenswrapper[4880]: W0218 11:51:52.999917 4880 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:52.999983 4880 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:52.999994 4880 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no 
items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.000010 4880 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.000023 4880 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.000040 4880 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.000085 4880 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.000057 4880 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.000099 4880 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted 
less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.000039 4880 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.000064 4880 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.000034 4880 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.131920 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 07:55:20.240113174 +0000 UTC Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.132264 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bfpgn"] Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.132537 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-bfpgn" Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.135113 4880 reflector.go:561] object-"openshift-dns"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.135149 4880 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.141674 4880 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.141713 4880 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.142398 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.159369 4880 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.182179 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.182773 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.183675 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.184385 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.185045 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.185597 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.186294 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.186913 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.187580 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.189794 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.190315 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.191542 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.192133 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.193170 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.193776 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.194581 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.194842 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.195498 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.195976 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.197331 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.198100 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.198710 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.199946 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.200450 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.201681 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.202151 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.203398 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.204316 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.204838 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.205736 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.206181 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.207175 4880 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.207302 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.209288 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.210207 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.210600 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.211199 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.212084 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.213082 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.213557 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.214586 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.215214 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.215648 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.216578 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.217492 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.218162 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.218956 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.219487 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.219955 4880 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.220422 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.221241 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.222230 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.222938 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.223477 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.224624 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.225318 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.226382 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.233351 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.256161 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/21c6371a-6404-4edc-9073-8b61c332a6f3-hosts-file\") pod \"node-resolver-bfpgn\" (UID: \"21c6371a-6404-4edc-9073-8b61c332a6f3\") " pod="openshift-dns/node-resolver-bfpgn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.256198 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqwzl\" (UniqueName: \"kubernetes.io/projected/21c6371a-6404-4edc-9073-8b61c332a6f3-kube-api-access-tqwzl\") pod \"node-resolver-bfpgn\" (UID: \"21c6371a-6404-4edc-9073-8b61c332a6f3\") " pod="openshift-dns/node-resolver-bfpgn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.258959 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.269113 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.279983 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.290157 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1"} Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.290212 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb"} Feb 18 11:51:53 
crc kubenswrapper[4880]: I0218 11:51:53.290222 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7e582d3492b675c104692a334466ee9c200f436c5e29c52dba76c797bb1ab32d"} Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.291281 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495"} Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.291324 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ce6f1b0ba3e74d5619fa6c41ecb43d0cd4478534e71efc53edfe1785c78a99ec"} Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.293383 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 
11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.293695 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b1b0ac30ac277ce0a15678b756ddd25998e32702f73407be10becbc79103bd10"} Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.295327 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.297225 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd"} Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.308726 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.330231 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.346189 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.357161 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqwzl\" (UniqueName: \"kubernetes.io/projected/21c6371a-6404-4edc-9073-8b61c332a6f3-kube-api-access-tqwzl\") pod \"node-resolver-bfpgn\" (UID: \"21c6371a-6404-4edc-9073-8b61c332a6f3\") " pod="openshift-dns/node-resolver-bfpgn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.357231 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/21c6371a-6404-4edc-9073-8b61c332a6f3-hosts-file\") pod \"node-resolver-bfpgn\" (UID: 
\"21c6371a-6404-4edc-9073-8b61c332a6f3\") " pod="openshift-dns/node-resolver-bfpgn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.357287 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/21c6371a-6404-4edc-9073-8b61c332a6f3-hosts-file\") pod \"node-resolver-bfpgn\" (UID: \"21c6371a-6404-4edc-9073-8b61c332a6f3\") " pod="openshift-dns/node-resolver-bfpgn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.368631 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\
\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f269
4ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.380153 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.391196 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.401673 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.413129 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.422234 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.431720 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.449032 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.485112 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.519740 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.532983 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.540565 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mh8wn"] Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.541069 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-c8jsp"] Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.541227 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.541450 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.544963 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.545027 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.545248 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.545368 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.545424 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-26dv6"] Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.545841 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.545907 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.546065 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.546065 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.546683 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.547206 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.547457 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.548009 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.548216 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.554404 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.561064 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.572559 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.580899 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.593373 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.602534 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.613315 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.625961 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.638514 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.646215 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.654997 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659703 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-hostroot\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659748 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqn9x\" (UniqueName: \"kubernetes.io/projected/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-kube-api-access-jqn9x\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659778 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-cnibin\") pod 
\"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659795 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvtg4\" (UniqueName: \"kubernetes.io/projected/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-kube-api-access-cvtg4\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659811 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-cni-dir\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659826 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-socket-dir-parent\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659841 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-var-lib-kubelet\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659856 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-conf-dir\") pod \"multus-mh8wn\" (UID: 
\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659883 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-cni-binary-copy\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659904 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-daemon-config\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659925 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-var-lib-cni-bin\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659951 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-rootfs\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659970 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-cnibin\") pod \"multus-additional-cni-plugins-26dv6\" (UID: 
\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.659986 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-cni-binary-copy\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660002 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bctkq\" (UniqueName: \"kubernetes.io/projected/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-kube-api-access-bctkq\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660022 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-os-release\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660041 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-system-cni-dir\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660060 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-proxy-tls\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660075 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-var-lib-cni-multus\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660090 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-tuning-conf-dir\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660105 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660119 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-run-k8s-cni-cncf-io\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660135 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-run-multus-certs\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660150 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-etc-kubernetes\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660169 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-mcd-auth-proxy-config\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660190 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-run-netns\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.660209 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-system-cni-dir\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 
11:51:53.660223 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-os-release\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.665624 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.683655 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.692664 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.701636 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.708494 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.718855 4880 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-18 11:46:52 +0000 UTC, rotation deadline is 2026-12-06 13:43:24.544667847 +0000 UTC Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.718944 4880 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6985h51m30.825727498s 
for next certificate rotation Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761297 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761363 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bctkq\" (UniqueName: \"kubernetes.io/projected/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-kube-api-access-bctkq\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761390 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-os-release\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761407 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-system-cni-dir\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761423 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-proxy-tls\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " 
pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761437 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-tuning-conf-dir\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761456 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-var-lib-cni-multus\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761472 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-run-multus-certs\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761505 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-run-multus-certs\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761517 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-system-cni-dir\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc 
kubenswrapper[4880]: E0218 11:51:53.761504 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:51:55.761480355 +0000 UTC m=+23.190381216 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761560 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-etc-kubernetes\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761583 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-mcd-auth-proxy-config\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761599 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " 
pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761634 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-run-k8s-cni-cncf-io\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761653 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-system-cni-dir\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761667 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-run-netns\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761681 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-os-release\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761694 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-hostroot\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761717 4880 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqn9x\" (UniqueName: \"kubernetes.io/projected/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-kube-api-access-jqn9x\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761734 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761750 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-cni-dir\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761765 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-cnibin\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761780 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvtg4\" (UniqueName: \"kubernetes.io/projected/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-kube-api-access-cvtg4\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761795 4880 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-conf-dir\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761802 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-system-cni-dir\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761851 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-run-netns\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761860 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-os-release\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761891 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-os-release\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761914 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-hostroot\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761925 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-run-k8s-cni-cncf-io\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.762150 4880 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.762187 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:55.762175905 +0000 UTC m=+23.191076766 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762187 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-tuning-conf-dir\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762238 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-etc-kubernetes\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761545 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-var-lib-cni-multus\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.761816 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-cni-binary-copy\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762293 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-socket-dir-parent\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762313 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-var-lib-kubelet\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762334 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-var-lib-cni-bin\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762353 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-daemon-config\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762378 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762397 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-cnibin\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762411 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-cni-binary-copy\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762431 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-rootfs\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762475 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-rootfs\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762507 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-socket-dir-parent\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762518 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-cni-dir\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762525 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-var-lib-kubelet\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762549 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-host-var-lib-cni-bin\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762555 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-cnibin\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762561 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-cni-binary-copy\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762624 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " 
pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762641 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-conf-dir\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762677 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-mcd-auth-proxy-config\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.762718 4880 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.762750 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:55.76274224 +0000 UTC m=+23.191643101 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.762686 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-cnibin\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.763016 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-multus-daemon-config\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.763227 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-cni-binary-copy\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.767322 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-proxy-tls\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.776925 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-bctkq\" (UniqueName: \"kubernetes.io/projected/bf5fee6a-c0f1-43c5-8991-cc078ccb904d-kube-api-access-bctkq\") pod \"machine-config-daemon-c8jsp\" (UID: \"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\") " pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.777347 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvtg4\" (UniqueName: \"kubernetes.io/projected/3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8-kube-api-access-cvtg4\") pod \"multus-mh8wn\" (UID: \"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\") " pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.786485 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqn9x\" (UniqueName: \"kubernetes.io/projected/a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157-kube-api-access-jqn9x\") pod \"multus-additional-cni-plugins-26dv6\" (UID: \"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\") " pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.812084 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.838472 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.853830 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mh8wn" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.861821 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.865801 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.866088 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.866268 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.866355 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.866443 4880 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.866531 4880 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:55.866515457 +0000 UTC m=+23.295416318 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.866893 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.866910 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.866920 4880 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:53 crc kubenswrapper[4880]: E0218 11:51:53.866959 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:55.866939648 +0000 UTC m=+23.295840509 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.867090 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.869428 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-26dv6" Feb 18 11:51:53 crc kubenswrapper[4880]: W0218 11:51:53.876113 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5fee6a_c0f1_43c5_8991_cc078ccb904d.slice/crio-bf0296792a99a6dcbcc49df3349cc13881fcdaa2fff73bbcc6a175c5144dd3e2 WatchSource:0}: Error finding container bf0296792a99a6dcbcc49df3349cc13881fcdaa2fff73bbcc6a175c5144dd3e2: Status 404 returned error can't find the container with id bf0296792a99a6dcbcc49df3349cc13881fcdaa2fff73bbcc6a175c5144dd3e2 Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.922998 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6jxd5"] Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.923882 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.924258 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.926991 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.927165 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.927259 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.927406 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.927533 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.927673 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.927673 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.941221 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.945274 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.950940 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.954521 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.955656 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:53 crc 
kubenswrapper[4880]: I0218 11:51:53.967097 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-systemd-units\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967131 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-slash\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967147 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-node-log\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967163 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-netd\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967198 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-ovn-kubernetes\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 
11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967230 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-bin\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967247 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-systemd\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967261 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-env-overrides\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967278 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-netns\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967293 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-var-lib-openvswitch\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967309 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-log-socket\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967379 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-kubelet\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967411 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzfkz\" (UniqueName: \"kubernetes.io/projected/0ff5dd18-cbe8-4a79-9518-9786a3521131-kube-api-access-xzfkz\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967430 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-openvswitch\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967472 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-ovn\") pod \"ovnkube-node-6jxd5\" (UID: 
\"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967487 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovn-node-metrics-cert\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967523 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967565 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-config\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967585 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-script-lib\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.967620 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-etc-openvswitch\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.970429 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.980926 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:53 crc kubenswrapper[4880]: I0218 11:51:53.995855 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.012388 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.025759 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.040797 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.049678 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068547 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-systemd-units\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068630 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-slash\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068659 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-node-log\") pod \"ovnkube-node-6jxd5\" (UID: 
\"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068679 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-netd\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068714 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-ovn-kubernetes\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068747 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-bin\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068767 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-systemd\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068786 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-env-overrides\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068811 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-netns\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068832 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-var-lib-openvswitch\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068852 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-log-socket\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068872 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzfkz\" (UniqueName: \"kubernetes.io/projected/0ff5dd18-cbe8-4a79-9518-9786a3521131-kube-api-access-xzfkz\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068903 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-kubelet\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 
crc kubenswrapper[4880]: I0218 11:51:54.068923 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-openvswitch\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068952 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-ovn\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068971 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovn-node-metrics-cert\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.068995 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069027 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-config\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: 
I0218 11:51:54.069050 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-script-lib\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069081 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-etc-openvswitch\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069148 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-etc-openvswitch\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069192 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-systemd-units\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069220 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-slash\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069250 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-node-log\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069279 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-netd\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069306 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-ovn-kubernetes\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069335 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-bin\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069365 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-systemd\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069551 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-openvswitch\") pod 
\"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069591 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-netns\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069628 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-var-lib-openvswitch\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069651 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-log-socket\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069714 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069864 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069899 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-kubelet\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.069914 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-ovn\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.070028 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-env-overrides\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.070387 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-script-lib\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.074928 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-config\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.075098 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.075820 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovn-node-metrics-cert\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.088933 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzfkz\" (UniqueName: \"kubernetes.io/projected/0ff5dd18-cbe8-4a79-9518-9786a3521131-kube-api-access-xzfkz\") pod \"ovnkube-node-6jxd5\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.096215 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.109993 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.120141 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.122069 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.127281 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqwzl\" (UniqueName: \"kubernetes.io/projected/21c6371a-6404-4edc-9073-8b61c332a6f3-kube-api-access-tqwzl\") pod \"node-resolver-bfpgn\" (UID: \"21c6371a-6404-4edc-9073-8b61c332a6f3\") " pod="openshift-dns/node-resolver-bfpgn" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.132454 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-11-12 07:24:49.287245619 +0000 UTC Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.139144 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.150627 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.169052 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.179416 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.179487 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:51:54 crc kubenswrapper[4880]: E0218 11:51:54.179538 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.179493 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:54 crc kubenswrapper[4880]: E0218 11:51:54.179667 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:51:54 crc kubenswrapper[4880]: E0218 11:51:54.179746 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.183221 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.196283 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.215460 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.228541 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.244807 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c3
84f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.250582 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.263544 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: W0218 11:51:54.264078 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ff5dd18_cbe8_4a79_9518_9786a3521131.slice/crio-e1534cddb14cd97a3849504461d496f35a9380e39fee3d7d2767f1799fad0326 WatchSource:0}: Error finding container e1534cddb14cd97a3849504461d496f35a9380e39fee3d7d2767f1799fad0326: Status 404 returned error can't find the container with id e1534cddb14cd97a3849504461d496f35a9380e39fee3d7d2767f1799fad0326 Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.275098 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.301959 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" 
event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"e1534cddb14cd97a3849504461d496f35a9380e39fee3d7d2767f1799fad0326"} Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.303319 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerStarted","Data":"afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599"} Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.303361 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerStarted","Data":"4af2429a519ec48db39078dc697f1aafa0cf2e3c395b2ef8d80e806be26367ba"} Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.305251 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerStarted","Data":"7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c"} Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.305311 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerStarted","Data":"ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82"} Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.305329 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerStarted","Data":"bf0296792a99a6dcbcc49df3349cc13881fcdaa2fff73bbcc6a175c5144dd3e2"} Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.306511 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mh8wn" 
event={"ID":"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8","Type":"ContainerStarted","Data":"4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d"} Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.306590 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mh8wn" event={"ID":"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8","Type":"ContainerStarted","Data":"9cf82e6a4914f97d7fc0a55e3b47476b449386235e8a4adc7cc23efe27d059d2"} Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.307113 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.315257 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.343142 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bfpgn" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.344082 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: W0218 11:51:54.354853 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c6371a_6404_4edc_9073_8b61c332a6f3.slice/crio-a42201867b6948579c9d5da820d051ccd68f9708f826819b578e768182bc5a3e WatchSource:0}: Error finding container a42201867b6948579c9d5da820d051ccd68f9708f826819b578e768182bc5a3e: Status 404 returned error can't find the container with id a42201867b6948579c9d5da820d051ccd68f9708f826819b578e768182bc5a3e Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.355540 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.410380 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.453279 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.514502 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.551960 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.573978 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.602317 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.658550 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.683564 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.730747 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.763317 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.804894 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.842403 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.887208 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.924075 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:54 crc kubenswrapper[4880]: I0218 11:51:54.966017 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba7
98de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:54Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.001932 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.047245 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.088794 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.123044 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.133129 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 20:15:30.964371907 +0000 UTC Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.312169 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888" exitCode=0 Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.312238 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888"} Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.314840 4880 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace"} Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.317338 4880 generic.go:334] "Generic (PLEG): container finished" podID="a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157" containerID="afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599" exitCode=0 Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.317418 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerDied","Data":"afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599"} Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.319481 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bfpgn" event={"ID":"21c6371a-6404-4edc-9073-8b61c332a6f3","Type":"ContainerStarted","Data":"de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531"} Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.319536 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bfpgn" event={"ID":"21c6371a-6404-4edc-9073-8b61c332a6f3","Type":"ContainerStarted","Data":"a42201867b6948579c9d5da820d051ccd68f9708f826819b578e768182bc5a3e"} Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.330353 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.361750 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.376029 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.389688 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.406458 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.420924 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.436337 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.451537 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.485038 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.526389 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba7
98de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.567904 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.610376 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.644113 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.685419 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.725892 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.749666 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6zfpm"] Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.750204 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.765076 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\
"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.774758 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 
11:51:55.787985 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:51:59.787955087 +0000 UTC m=+27.216855948 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.787827 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.788181 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.788350 4880 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.788420 4880 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:59.78840975 +0000 UTC m=+27.217310621 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.789024 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.789116 4880 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.789158 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:59.789150049 +0000 UTC m=+27.218050910 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.795440 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.815938 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.835569 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.890297 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6bd757b-e4ff-456b-8531-a8cdf2095d1b-host\") pod \"node-ca-6zfpm\" (UID: \"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\") " pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.890886 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.890915 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d6bd757b-e4ff-456b-8531-a8cdf2095d1b-serviceca\") pod \"node-ca-6zfpm\" (UID: 
\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\") " pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.890951 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvl9\" (UniqueName: \"kubernetes.io/projected/d6bd757b-e4ff-456b-8531-a8cdf2095d1b-kube-api-access-6dvl9\") pod \"node-ca-6zfpm\" (UID: \"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\") " pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.890986 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.891156 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.891174 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.891188 4880 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.891252 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:59.891232891 +0000 UTC m=+27.320133752 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.891306 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.891317 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.891325 4880 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:55 crc kubenswrapper[4880]: E0218 11:51:55.891345 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:51:59.891339524 +0000 UTC m=+27.320240385 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.893397 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.926844 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.965986 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba7
98de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:55Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.992131 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvl9\" (UniqueName: \"kubernetes.io/projected/d6bd757b-e4ff-456b-8531-a8cdf2095d1b-kube-api-access-6dvl9\") pod \"node-ca-6zfpm\" (UID: \"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\") " pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.992196 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6bd757b-e4ff-456b-8531-a8cdf2095d1b-host\") pod \"node-ca-6zfpm\" (UID: \"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\") " pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.992222 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d6bd757b-e4ff-456b-8531-a8cdf2095d1b-serviceca\") pod \"node-ca-6zfpm\" (UID: \"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\") " pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:55 crc kubenswrapper[4880]: I0218 11:51:55.992351 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6bd757b-e4ff-456b-8531-a8cdf2095d1b-host\") pod \"node-ca-6zfpm\" (UID: \"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\") " pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.002927 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/d6bd757b-e4ff-456b-8531-a8cdf2095d1b-serviceca\") pod \"node-ca-6zfpm\" (UID: \"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\") " pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.006181 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.033821 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dvl9\" (UniqueName: \"kubernetes.io/projected/d6bd757b-e4ff-456b-8531-a8cdf2095d1b-kube-api-access-6dvl9\") pod \"node-ca-6zfpm\" (UID: \"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\") " pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.064306 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.103639 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf6
1e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.133531 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 02:13:34.323908046 +0000 UTC Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.147873 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.179637 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.179660 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.179714 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:51:56 crc kubenswrapper[4880]: E0218 11:51:56.179808 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:51:56 crc kubenswrapper[4880]: E0218 11:51:56.179897 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:51:56 crc kubenswrapper[4880]: E0218 11:51:56.179983 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.195079 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-1
8T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.224069 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.263788 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.264912 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6zfpm" Feb 18 11:51:56 crc kubenswrapper[4880]: W0218 11:51:56.276630 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6bd757b_e4ff_456b_8531_a8cdf2095d1b.slice/crio-64e90aac1e58dd2f2b31974a4a6502954f22e53862f76d37728a066391b21c69 WatchSource:0}: Error finding container 64e90aac1e58dd2f2b31974a4a6502954f22e53862f76d37728a066391b21c69: Status 404 returned error can't find the container with id 64e90aac1e58dd2f2b31974a4a6502954f22e53862f76d37728a066391b21c69 Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.301836 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276
703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.326519 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerStarted","Data":"f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9"} Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.328532 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3"} Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.329928 4880 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6zfpm" event={"ID":"d6bd757b-e4ff-456b-8531-a8cdf2095d1b","Type":"ContainerStarted","Data":"64e90aac1e58dd2f2b31974a4a6502954f22e53862f76d37728a066391b21c69"} Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.351189 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.384692 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.428430 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.465134 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.505956 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.547565 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.590869 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.627764 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c3
84f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.665304 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.708405 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.742057 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.791578 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.829665 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.864854 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.903394 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:56 crc kubenswrapper[4880]: I0218 11:51:56.940200 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:56Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.134518 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 17:40:09.267349998 +0000 UTC Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.336923 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" 
event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad"} Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.336979 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585"} Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.337000 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4"} Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.337019 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616"} Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.337049 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9"} Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.338000 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6zfpm" event={"ID":"d6bd757b-e4ff-456b-8531-a8cdf2095d1b","Type":"ContainerStarted","Data":"748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67"} Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.340567 4880 generic.go:334] "Generic (PLEG): container finished" podID="a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157" containerID="f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9" 
exitCode=0 Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.340638 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerDied","Data":"f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9"} Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.360404 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 
2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.374658 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.388838 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.400818 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba7
98de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.414636 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\
\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.427208 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.449949 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-
readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.463903 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.478207 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.487835 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.496154 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.511941 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.525455 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.540552 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.555054 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.589312 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.625121 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.672224 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.703515 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.744256 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.795316 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.826304 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.862506 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.902217 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.946561 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:57 crc kubenswrapper[4880]: I0218 11:51:57.984484 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c3
84f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:57Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.024137 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.062601 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.107704 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.134652 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 18:36:16.924558352 
+0000 UTC Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.144944 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.179279 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:51:58 crc kubenswrapper[4880]: E0218 11:51:58.179415 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.179555 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:51:58 crc kubenswrapper[4880]: E0218 11:51:58.179737 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.179821 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:58 crc kubenswrapper[4880]: E0218 11:51:58.179883 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.347481 4880 generic.go:334] "Generic (PLEG): container finished" podID="a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157" containerID="1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c" exitCode=0 Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.347573 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerDied","Data":"1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c"} Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.370473 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.383971 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.398586 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.399073 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.402472 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.412159 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.421637 4880 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.423246 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.423278 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.423289 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.423416 4880 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.425405 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c3
84f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.429914 4880 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.430288 4880 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.431780 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.431846 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.431865 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc 
kubenswrapper[4880]: I0218 11:51:58.431897 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.431919 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:58Z","lastTransitionTime":"2026-02-18T11:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.442111 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: E0218 11:51:58.447717 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.451641 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.451669 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.451680 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.451696 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.451706 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:58Z","lastTransitionTime":"2026-02-18T11:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:51:58 crc kubenswrapper[4880]: E0218 11:51:58.464394 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.468811 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.468868 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.468881 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.468896 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.468905 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:58Z","lastTransitionTime":"2026-02-18T11:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.471126 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z 
is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: E0218 11:51:58.486860 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.491296 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.491352 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.491366 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.491383 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.491396 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:58Z","lastTransitionTime":"2026-02-18T11:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.504578 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: E0218 11:51:58.507223 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.512478 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.512529 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.512541 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.512571 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.512587 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:58Z","lastTransitionTime":"2026-02-18T11:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:51:58 crc kubenswrapper[4880]: E0218 11:51:58.526918 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: E0218 11:51:58.527075 4880 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.528637 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.528667 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.528677 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.528690 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.528700 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:58Z","lastTransitionTime":"2026-02-18T11:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.545892 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 
11:51:58.593756 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.623668 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.631347 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.631377 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.631388 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 
11:51:58.631406 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.631419 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:58Z","lastTransitionTime":"2026-02-18T11:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.663184 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.701493 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.733449 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.733492 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.733504 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.733521 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.733533 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:58Z","lastTransitionTime":"2026-02-18T11:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.745357 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.813162 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.836086 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.836125 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.836137 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.836155 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.836169 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:58Z","lastTransitionTime":"2026-02-18T11:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.857326 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.871453 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.906218 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.939064 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.939107 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.939126 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.939146 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.939158 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:58Z","lastTransitionTime":"2026-02-18T11:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.943465 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:58 crc kubenswrapper[4880]: I0218 11:51:58.983777 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:58Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.028385 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.041651 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.041689 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.041700 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.041717 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.041728 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:59Z","lastTransitionTime":"2026-02-18T11:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.063684 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.102897 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.135533 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 21:47:09.309390668 +0000 UTC Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.142955 4880 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.144544 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.144571 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.144582 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.144596 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.144624 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:59Z","lastTransitionTime":"2026-02-18T11:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.186730 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.224016 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.247505 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.247550 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.247559 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.247574 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.247587 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:59Z","lastTransitionTime":"2026-02-18T11:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.266216 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.315829 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.347850 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.350091 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.350142 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.350161 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.350184 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.350203 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:59Z","lastTransitionTime":"2026-02-18T11:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.354798 4880 generic.go:334] "Generic (PLEG): container finished" podID="a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157" containerID="5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132" exitCode=0 Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.354907 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerDied","Data":"5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.363000 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.386764 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.426251 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.453866 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.453901 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.453912 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.453930 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.453941 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:59Z","lastTransitionTime":"2026-02-18T11:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.463945 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.506292 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.545689 4880 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.557144 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.557182 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.557192 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.557211 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.557222 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:59Z","lastTransitionTime":"2026-02-18T11:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.586306 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z 
is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.625736 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.660561 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.660861 4880 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.660872 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.660888 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.660897 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:59Z","lastTransitionTime":"2026-02-18T11:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.678506 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.714053 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.744644 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.763309 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.763397 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.763410 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.763428 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.763440 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:59Z","lastTransitionTime":"2026-02-18T11:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.781840 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.822725 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.829981 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.830127 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:52:07.830100909 +0000 UTC m=+35.259001770 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.830179 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.830256 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.830353 4880 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.830391 4880 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.830420 4880 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:07.830405528 +0000 UTC m=+35.259306389 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.830438 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:07.830431669 +0000 UTC m=+35.259332530 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.865640 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.865684 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.865694 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.865706 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.865717 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:59Z","lastTransitionTime":"2026-02-18T11:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.870159 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.904519 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.930969 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.931029 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.931163 4880 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.931180 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.931193 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.931204 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.931207 4880 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.931219 4880 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.931266 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:07.931250437 +0000 UTC m=+35.360151298 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:59 crc kubenswrapper[4880]: E0218 11:51:59.931286 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:07.931278108 +0000 UTC m=+35.360178969 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.945354 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.967833 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.967867 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.967876 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.967889 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.967898 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:51:59Z","lastTransitionTime":"2026-02-18T11:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:51:59 crc kubenswrapper[4880]: I0218 11:51:59.983349 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:51:59Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.070866 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.070902 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.070911 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.070924 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.070932 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:00Z","lastTransitionTime":"2026-02-18T11:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.136286 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 14:56:26.13864043 +0000 UTC Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.173553 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.173622 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.173636 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.173652 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.173661 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:00Z","lastTransitionTime":"2026-02-18T11:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.178896 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.178923 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.178939 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:00 crc kubenswrapper[4880]: E0218 11:52:00.179013 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:00 crc kubenswrapper[4880]: E0218 11:52:00.179089 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:00 crc kubenswrapper[4880]: E0218 11:52:00.179167 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.276806 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.276847 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.276857 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.276874 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.276884 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:00Z","lastTransitionTime":"2026-02-18T11:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.368181 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerStarted","Data":"a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025"} Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.379183 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.379225 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.379238 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.379260 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.379273 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:00Z","lastTransitionTime":"2026-02-18T11:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.383030 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.396646 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.409877 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.422068 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.435421 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf6
1e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.453458 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn
9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.468224 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.480593 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.480972 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.480999 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.481008 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.481022 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.481031 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:00Z","lastTransitionTime":"2026-02-18T11:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.490842 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.499053 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.516578 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.528424 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.539443 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.549430 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.583430 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.583463 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.583472 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.583486 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.583495 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:00Z","lastTransitionTime":"2026-02-18T11:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.590868 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:00Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.686010 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.686054 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.686067 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.686083 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.686094 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:00Z","lastTransitionTime":"2026-02-18T11:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.788495 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.788532 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.788540 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.788558 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.788576 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:00Z","lastTransitionTime":"2026-02-18T11:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.890790 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.890843 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.890857 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.890874 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.890886 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:00Z","lastTransitionTime":"2026-02-18T11:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.993572 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.993646 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.993658 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.993672 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:00 crc kubenswrapper[4880]: I0218 11:52:00.993681 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:00Z","lastTransitionTime":"2026-02-18T11:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.095555 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.095594 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.095620 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.095639 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.095650 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:01Z","lastTransitionTime":"2026-02-18T11:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.136944 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 21:56:41.453296584 +0000 UTC Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.197356 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.197388 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.197397 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.197410 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.197425 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:01Z","lastTransitionTime":"2026-02-18T11:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.299521 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.299549 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.299561 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.299574 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.299583 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:01Z","lastTransitionTime":"2026-02-18T11:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.372844 4880 generic.go:334] "Generic (PLEG): container finished" podID="a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157" containerID="a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025" exitCode=0 Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.372890 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerDied","Data":"a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.378211 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.378474 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.385363 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.401797 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.401822 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.401831 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.401845 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.401856 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:01Z","lastTransitionTime":"2026-02-18T11:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.407193 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.407236 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.421776 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.435729 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.447504 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.465761 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.486035 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.500058 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.503549 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.503600 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.503627 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.503647 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.503656 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:01Z","lastTransitionTime":"2026-02-18T11:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.510741 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z 
is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.521642 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.533947 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.551328 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.563373 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.575010 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.585699 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.594367 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.605899 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.605936 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.605945 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.605959 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.605967 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:01Z","lastTransitionTime":"2026-02-18T11:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.610855 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.629374 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-1
8T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.641319 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.652458 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.665463 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.676756 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.687206 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.698386 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.708124 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.708166 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.708180 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.708197 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.708207 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:01Z","lastTransitionTime":"2026-02-18T11:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.708853 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.724315 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.753152 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.765760 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.778230 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.790780 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:01Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.810744 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.810780 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.810789 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.810803 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.810816 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:01Z","lastTransitionTime":"2026-02-18T11:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.913029 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.913069 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.913078 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.913090 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:01 crc kubenswrapper[4880]: I0218 11:52:01.913099 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:01Z","lastTransitionTime":"2026-02-18T11:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.015925 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.015967 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.015979 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.015998 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.016016 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:02Z","lastTransitionTime":"2026-02-18T11:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.117951 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.117994 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.118007 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.118024 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.118035 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:02Z","lastTransitionTime":"2026-02-18T11:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.137198 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 08:21:13.103774958 +0000 UTC Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.178708 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.178745 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.178834 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:02 crc kubenswrapper[4880]: E0218 11:52:02.178930 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:02 crc kubenswrapper[4880]: E0218 11:52:02.179042 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:02 crc kubenswrapper[4880]: E0218 11:52:02.179140 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.222513 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.222593 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.222646 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.222670 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.222689 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:02Z","lastTransitionTime":"2026-02-18T11:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.326019 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.326052 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.326061 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.326073 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.326082 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:02Z","lastTransitionTime":"2026-02-18T11:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.344714 4880 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.383417 4880 generic.go:334] "Generic (PLEG): container finished" podID="a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157" containerID="ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367" exitCode=0 Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.383456 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerDied","Data":"ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.383550 4880 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.384041 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.396948 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.408524 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:52:02 crc 
kubenswrapper[4880]: I0218 11:52:02.411290 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6
dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.428533 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.428601 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.428639 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.428663 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.428678 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:02Z","lastTransitionTime":"2026-02-18T11:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.439327 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.453474 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.476667 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.491314 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.505921 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.519256 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.530754 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.530784 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.530794 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.530806 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.530817 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:02Z","lastTransitionTime":"2026-02-18T11:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.532733 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.544156 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.555754 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.567859 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.580791 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.593792 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.607382 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.620020 4880 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.630282 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.632566 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.632599 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.632625 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.632642 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.632653 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:02Z","lastTransitionTime":"2026-02-18T11:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.647975 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.660643 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.679371 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.691351 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.705862 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.722425 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.735840 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.735878 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.735889 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.735902 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.735911 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:02Z","lastTransitionTime":"2026-02-18T11:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.761755 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.801630 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.815784 4880 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.838397 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.838443 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.838459 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.838481 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.838501 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:02Z","lastTransitionTime":"2026-02-18T11:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.864482 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z 
is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.903969 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.941079 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.941117 4880 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.941129 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.941147 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.941158 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:02Z","lastTransitionTime":"2026-02-18T11:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.943401 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb
9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:02 crc kubenswrapper[4880]: I0218 11:52:02.984512 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:02Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.024464 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.043645 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.043680 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.043691 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.043708 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.043720 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:03Z","lastTransitionTime":"2026-02-18T11:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.064390 4880 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.138318 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 10:57:27.41074717 +0000 UTC Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.147011 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.147033 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.147042 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.147099 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.147111 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:03Z","lastTransitionTime":"2026-02-18T11:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.218270 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1
c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.244882 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.249336 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.249382 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.249394 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.249413 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.249425 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:03Z","lastTransitionTime":"2026-02-18T11:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.259237 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z 
is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.271518 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.283910 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.296194 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.322889 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.351352 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.351395 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.351406 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 
11:52:03.351421 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.351430 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:03Z","lastTransitionTime":"2026-02-18T11:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.361269 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.388514 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" event={"ID":"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157","Type":"ContainerStarted","Data":"da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7"} Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.388578 4880 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.399849 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.448682 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5
ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.453753 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.453794 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.453806 4880 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.453823 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.453833 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:03Z","lastTransitionTime":"2026-02-18T11:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.480770 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9
eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.528913 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.556862 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.556906 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.556917 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.556931 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.556940 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:03Z","lastTransitionTime":"2026-02-18T11:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.566746 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.602037 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.642430 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.658602 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.658654 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.658668 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.658685 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.658693 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:03Z","lastTransitionTime":"2026-02-18T11:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.689992 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.723189 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.760573 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.760672 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.760686 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.760703 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.760713 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:03Z","lastTransitionTime":"2026-02-18T11:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.765993 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.805593 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.844532 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.863540 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.863581 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.863589 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.863603 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.863630 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:03Z","lastTransitionTime":"2026-02-18T11:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.883098 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.924526 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.965115 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-18T11:52:03Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.965949 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.965974 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.965983 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.965998 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:03 crc kubenswrapper[4880]: I0218 11:52:03.966007 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:03Z","lastTransitionTime":"2026-02-18T11:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.005749 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.044047 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.068104 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.068141 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.068151 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.068165 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.068174 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:04Z","lastTransitionTime":"2026-02-18T11:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.082483 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1
c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.122902 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.139352 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 
14:11:25.008117772 +0000 UTC Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.161541 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.169832 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.169871 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.169883 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.169900 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.169912 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:04Z","lastTransitionTime":"2026-02-18T11:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.179324 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.179388 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.179324 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:04 crc kubenswrapper[4880]: E0218 11:52:04.179467 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:04 crc kubenswrapper[4880]: E0218 11:52:04.179520 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:04 crc kubenswrapper[4880]: E0218 11:52:04.179567 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.208187 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.242903 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:04Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.271945 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.271976 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.271985 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 
11:52:04.271998 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.272006 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:04Z","lastTransitionTime":"2026-02-18T11:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.373995 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.374045 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.374059 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.374077 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.374113 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:04Z","lastTransitionTime":"2026-02-18T11:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.391635 4880 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.476553 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.476862 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.476874 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.476890 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.476905 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:04Z","lastTransitionTime":"2026-02-18T11:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.579562 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.579637 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.579650 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.579668 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.579680 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:04Z","lastTransitionTime":"2026-02-18T11:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.682006 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.682053 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.682068 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.682093 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.682105 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:04Z","lastTransitionTime":"2026-02-18T11:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.784414 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.784472 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.784482 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.784499 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.784509 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:04Z","lastTransitionTime":"2026-02-18T11:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.888119 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.888169 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.888185 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.888204 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.888215 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:04Z","lastTransitionTime":"2026-02-18T11:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.991285 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.991356 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.991380 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.991411 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:04 crc kubenswrapper[4880]: I0218 11:52:04.991434 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:04Z","lastTransitionTime":"2026-02-18T11:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.093697 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.093771 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.093784 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.093802 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.093815 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:05Z","lastTransitionTime":"2026-02-18T11:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.139665 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 06:09:55.468563067 +0000 UTC Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.195436 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.195467 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.195477 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.195491 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.195502 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:05Z","lastTransitionTime":"2026-02-18T11:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.297421 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.297486 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.297504 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.297528 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.297545 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:05Z","lastTransitionTime":"2026-02-18T11:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.327162 4880 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.398648 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/0.log" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.399829 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.399870 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.399882 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.399901 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.399913 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:05Z","lastTransitionTime":"2026-02-18T11:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.402540 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7" exitCode=1 Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.402585 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7"} Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.406897 4880 scope.go:117] "RemoveContainer" containerID="e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.422592 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269
e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.428706 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv"] Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.429173 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.430793 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.431435 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.451953 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:04Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0218 11:52:04.634507 6141 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 11:52:04.634528 6141 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:04.634560 
6141 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634661 6141 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634676 6141 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.634915 6141 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635074 6141 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635090 6141 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635139 6141 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635229 6141 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4
ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.465871 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.481359 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.483988 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98sjq\" (UniqueName: \"kubernetes.io/projected/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-kube-api-access-98sjq\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.484100 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.484200 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.484273 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.497702 4880 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325745
3265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.502436 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.502510 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.502528 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.502545 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.502556 4880 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:05Z","lastTransitionTime":"2026-02-18T11:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.513275 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-mul
tus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"sta
rtTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.526441 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.545054 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.561728 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.575902 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.585115 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.585183 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.585214 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98sjq\" (UniqueName: \"kubernetes.io/projected/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-kube-api-access-98sjq\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.585252 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.586078 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 
11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.586170 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.590334 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"
},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.591162 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.601742 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.602011 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98sjq\" (UniqueName: \"kubernetes.io/projected/fe51086c-95bb-47e7-a56d-4bb9437c7b1c-kube-api-access-98sjq\") pod \"ovnkube-control-plane-749d76644c-q4njv\" (UID: \"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.605700 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.605746 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.605758 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.605778 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.605798 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:05Z","lastTransitionTime":"2026-02-18T11:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.626441 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.639872 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.654590 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.666449 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.681405 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.692357 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.702797 4880 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.708505 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.708555 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.708566 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.708581 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.708589 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:05Z","lastTransitionTime":"2026-02-18T11:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.713973 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.726059 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e2
8638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:
51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.738232 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.753369 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.755039 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.767550 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: W0218 11:52:05.767769 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe51086c_95bb_47e7_a56d_4bb9437c7b1c.slice/crio-cedd375240a48ca0657962429933677861156472819620f1cc488a2a274c6892 WatchSource:0}: Error finding container cedd375240a48ca0657962429933677861156472819620f1cc488a2a274c6892: Status 404 returned error can't find the container with id cedd375240a48ca0657962429933677861156472819620f1cc488a2a274c6892 Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.783388 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0
dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],
\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.807585 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\
"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687
7441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.811180 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.811206 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.811214 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 
11:52:05.811227 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.811237 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:05Z","lastTransitionTime":"2026-02-18T11:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.823518 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.835126 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.844645 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.854153 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.873851 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:04Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0218 11:52:04.634507 6141 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 11:52:04.634528 6141 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:04.634560 
6141 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634661 6141 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634676 6141 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.634915 6141 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635074 6141 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635090 6141 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635139 6141 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635229 6141 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4
ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:05Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.913224 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.913267 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.913277 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.913291 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:05 crc kubenswrapper[4880]: I0218 11:52:05.913300 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:05Z","lastTransitionTime":"2026-02-18T11:52:05Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.015794 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.015831 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.015840 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.015853 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.015861 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:06Z","lastTransitionTime":"2026-02-18T11:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.118166 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.118489 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.118500 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.118516 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.118528 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:06Z","lastTransitionTime":"2026-02-18T11:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.139962 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 22:36:30.404676567 +0000 UTC Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.179271 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.179305 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.179337 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:06 crc kubenswrapper[4880]: E0218 11:52:06.179395 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:06 crc kubenswrapper[4880]: E0218 11:52:06.179486 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:06 crc kubenswrapper[4880]: E0218 11:52:06.179560 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.221103 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.221133 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.221142 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.221175 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.221187 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:06Z","lastTransitionTime":"2026-02-18T11:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.323414 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.323464 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.323474 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.323488 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.323497 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:06Z","lastTransitionTime":"2026-02-18T11:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.383139 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.407890 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1
fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.408530 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/0.log" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.412046 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.412256 4880 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.414064 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" 
event={"ID":"fe51086c-95bb-47e7-a56d-4bb9437c7b1c","Type":"ContainerStarted","Data":"f0f1595328b9ec64bf85eaefd9036d238ede2d122e7999779bc27966d39a8c7d"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.414115 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" event={"ID":"fe51086c-95bb-47e7-a56d-4bb9437c7b1c","Type":"ContainerStarted","Data":"b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.414129 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" event={"ID":"fe51086c-95bb-47e7-a56d-4bb9437c7b1c","Type":"ContainerStarted","Data":"cedd375240a48ca0657962429933677861156472819620f1cc488a2a274c6892"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.421473 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.425939 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.425969 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.425979 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 
11:52:06.425995 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.426006 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:06Z","lastTransitionTime":"2026-02-18T11:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.436601 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.446259 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.466724 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.488075 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:04Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0218 11:52:04.634507 6141 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 11:52:04.634528 6141 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:04.634560 
6141 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634661 6141 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634676 6141 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.634915 6141 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635074 6141 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635090 6141 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635139 6141 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635229 6141 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4
ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.507573 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.521524 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.528069 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.528109 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.528121 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.528137 4880 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.528166 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:06Z","lastTransitionTime":"2026-02-18T11:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.536110 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.545956 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.557438 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192
.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.567828 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.578004 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.589042 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.642725 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.642762 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.642778 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.642794 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.642808 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:06Z","lastTransitionTime":"2026-02-18T11:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.646713 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.659781 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.676834 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c
2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.692643 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.706961 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.716406 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.725839 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.744857 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:04Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0218 11:52:04.634507 6141 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 11:52:04.634528 6141 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:04.634560 6141 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634661 6141 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634676 6141 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.634915 6141 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635074 6141 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635090 6141 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635139 6141 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635229 6141 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.745033 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.745076 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.745087 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.745103 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.745114 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:06Z","lastTransitionTime":"2026-02-18T11:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.757841 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.768198 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 
2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.779323 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.789170 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.800559 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\"
,\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd
90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.811357 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.826365 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.846157 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.847449 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.847485 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.847498 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.847515 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.847540 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:06Z","lastTransitionTime":"2026-02-18T11:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.858940 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.871993 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:06Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.949694 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.949730 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.949740 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.949753 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:06 crc kubenswrapper[4880]: I0218 11:52:06.949762 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:06Z","lastTransitionTime":"2026-02-18T11:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.053140 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.053186 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.053197 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.053211 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.053220 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:07Z","lastTransitionTime":"2026-02-18T11:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.140364 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 20:57:13.318296574 +0000 UTC Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.155766 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.155807 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.155816 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.155832 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.155842 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:07Z","lastTransitionTime":"2026-02-18T11:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.258289 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.258380 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.258404 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.258439 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.258462 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:07Z","lastTransitionTime":"2026-02-18T11:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.275196 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-nj7dq"] Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.275733 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.275834 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.286179 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.303545 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:04Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0218 11:52:04.634507 6141 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 11:52:04.634528 6141 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:04.634560 6141 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634661 6141 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634676 6141 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.634915 6141 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635074 6141 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635090 6141 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635139 6141 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635229 6141 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.315374 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.327817 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.342888 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.346647 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drpqb\" (UniqueName: \"kubernetes.io/projected/c6981df1-6d75-41e2-a41e-ac960f0a847a-kube-api-access-drpqb\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.346692 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.356248 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d
238ede2d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.361349 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.361385 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.361395 4880 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.361416 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.361428 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:07Z","lastTransitionTime":"2026-02-18T11:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.373349 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 
2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.395270 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 
certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee
1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.411881 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.420247 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/1.log" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.421088 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/0.log" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.423968 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac" exitCode=1 Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.424074 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 
11:52:07.424215 4880 scope.go:117] "RemoveContainer" containerID="e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.425319 4880 scope.go:117] "RemoveContainer" containerID="96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac" Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.425592 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.430666 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4
f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.448489 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drpqb\" (UniqueName: \"kubernetes.io/projected/c6981df1-6d75-41e2-a41e-ac960f0a847a-kube-api-access-drpqb\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.448592 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.449014 4880 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.449102 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs podName:c6981df1-6d75-41e2-a41e-ac960f0a847a nodeName:}" failed. No retries permitted until 2026-02-18 11:52:07.94907678 +0000 UTC m=+35.377977651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs") pod "network-metrics-daemon-nj7dq" (UID: "c6981df1-6d75-41e2-a41e-ac960f0a847a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.454654 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c
63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.465128 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.465434 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.465533 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.465737 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.465843 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:07Z","lastTransitionTime":"2026-02-18T11:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.471383 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.474734 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drpqb\" (UniqueName: \"kubernetes.io/projected/c6981df1-6d75-41e2-a41e-ac960f0a847a-kube-api-access-drpqb\") pod 
\"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.492563 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\
\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.522995 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\
"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687
7441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.536028 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.549011 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.565491 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.568709 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.568828 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.568898 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.569043 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.569129 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:07Z","lastTransitionTime":"2026-02-18T11:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.584347 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.629976 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:04Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0218 11:52:04.634507 6141 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 11:52:04.634528 6141 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:04.634560 6141 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634661 6141 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634676 6141 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.634915 6141 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635074 6141 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635090 6141 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635139 6141 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635229 6141 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"message\\\":\\\"factory.go:141\\\\nI0218 11:52:06.909429 6305 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:52:06.909547 6305 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:52:06.909701 6305 reflector.go:311] 
Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:06.910017 6305 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:52:06.910039 6305 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:52:06.910058 6305 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 11:52:06.910064 6305 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:52:06.910078 6305 factory.go:656] Stopping watch factory\\\\nI0218 11:52:06.910089 6305 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:52:06.910116 6305 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:06.910124 6305 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:52:06.910131 6305 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},
{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.666189 4880 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.673243 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.673801 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.673892 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.673974 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.674204 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:07Z","lastTransitionTime":"2026-02-18T11:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.708960 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.749338 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.777674 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.777756 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.777779 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.777813 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.777832 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:07Z","lastTransitionTime":"2026-02-18T11:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.789411 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.824162 4880 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc 
kubenswrapper[4880]: I0218 11:52:07.851575 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.851847 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:52:23.851799198 +0000 UTC m=+51.280700069 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.852008 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.852075 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.852190 4880 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.852253 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:23.852243309 +0000 UTC m=+51.281144180 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.852272 4880 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.852402 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:23.852375513 +0000 UTC m=+51.281276414 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.866217 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9
f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.881725 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.881805 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.881831 
4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.881868 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.881893 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:07Z","lastTransitionTime":"2026-02-18T11:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.908472 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-ce
rt-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.946594 4880 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.953287 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.953385 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.953427 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.953654 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.953692 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.953711 
4880 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.953807 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:23.953777206 +0000 UTC m=+51.382678067 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.953916 4880 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.954068 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs podName:c6981df1-6d75-41e2-a41e-ac960f0a847a nodeName:}" failed. No retries permitted until 2026-02-18 11:52:08.954026414 +0000 UTC m=+36.382927275 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs") pod "network-metrics-daemon-nj7dq" (UID: "c6981df1-6d75-41e2-a41e-ac960f0a847a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.954255 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.954351 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.954432 4880 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:52:07 crc kubenswrapper[4880]: E0218 11:52:07.954577 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:23.954553258 +0000 UTC m=+51.383454129 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.984658 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.984876 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.985014 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.985142 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.985297 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:07Z","lastTransitionTime":"2026-02-18T11:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:07 crc kubenswrapper[4880]: I0218 11:52:07.992977 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:07Z 
is after 2025-08-24T17:21:41Z" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.022860 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.068395 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.088919 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.088990 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.089002 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.089019 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.089047 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.110196 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.140915 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 05:35:29.489686217 +0000 UTC Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.159716 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.179584 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.179653 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.179591 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.179755 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.179839 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.179929 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.191391 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.191671 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.191760 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.191847 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.191939 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.192318 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.222335 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.294171 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.294416 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.294512 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.294595 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.294716 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.396965 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.397018 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.397028 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.397043 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.397061 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.428520 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/1.log" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.499299 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.499354 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.499367 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.499384 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.499396 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.602135 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.602206 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.602221 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.602241 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.602255 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.704838 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.704878 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.704887 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.704902 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.704911 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.717065 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.717125 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.717135 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.717149 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.717158 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.728916 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.735659 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.735706 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.735743 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.735768 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.735786 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.748214 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.751654 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.751701 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.751716 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.751736 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.751752 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.764648 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.767983 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.768023 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.768045 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.768060 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.768072 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.779991 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.783772 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.783822 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.783834 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.783851 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.783895 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.795183 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:08Z is after 2025-08-24T17:21:41Z"
Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.795299 4880 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.806632 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.806733 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.806796 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.806862 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.806920 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.908585 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.908650 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.908662 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.908677 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.908688 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:08Z","lastTransitionTime":"2026-02-18T11:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:08 crc kubenswrapper[4880]: I0218 11:52:08.963127 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq"
Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.963259 4880 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 18 11:52:08 crc kubenswrapper[4880]: E0218 11:52:08.963320 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs podName:c6981df1-6d75-41e2-a41e-ac960f0a847a nodeName:}" failed. No retries permitted until 2026-02-18 11:52:10.963302372 +0000 UTC m=+38.392203233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs") pod "network-metrics-daemon-nj7dq" (UID: "c6981df1-6d75-41e2-a41e-ac960f0a847a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.011497 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.011535 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.011545 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.011560 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.011570 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:09Z","lastTransitionTime":"2026-02-18T11:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.113989 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.114025 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.114035 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.114050 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.114059 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:09Z","lastTransitionTime":"2026-02-18T11:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.141447 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 23:56:57.196469821 +0000 UTC
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.179211 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq"
Feb 18 11:52:09 crc kubenswrapper[4880]: E0218 11:52:09.179360 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.216127 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.216171 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.216182 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.216198 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.216210 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:09Z","lastTransitionTime":"2026-02-18T11:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.319215 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.319468 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.319534 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.319665 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.319745 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:09Z","lastTransitionTime":"2026-02-18T11:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.422522 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.422588 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.422632 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.422652 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.422664 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:09Z","lastTransitionTime":"2026-02-18T11:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.524927 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.524967 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.524978 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.524992 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.525002 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:09Z","lastTransitionTime":"2026-02-18T11:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.628059 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.628107 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.628120 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.628137 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.628149 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:09Z","lastTransitionTime":"2026-02-18T11:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.730502 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.730568 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.730582 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.730599 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.730630 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:09Z","lastTransitionTime":"2026-02-18T11:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.833133 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.833215 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.833231 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.833295 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.833314 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:09Z","lastTransitionTime":"2026-02-18T11:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.935533 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.935565 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.935574 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.935587 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:09 crc kubenswrapper[4880]: I0218 11:52:09.935596 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:09Z","lastTransitionTime":"2026-02-18T11:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.037456 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.037540 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.037576 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.037597 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.037649 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:10Z","lastTransitionTime":"2026-02-18T11:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.140459 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.140789 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.140893 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.140976 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.141124 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:10Z","lastTransitionTime":"2026-02-18T11:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.142565 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 06:49:44.505251456 +0000 UTC Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.179527 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.179566 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.179547 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:10 crc kubenswrapper[4880]: E0218 11:52:10.179667 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:10 crc kubenswrapper[4880]: E0218 11:52:10.179734 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:10 crc kubenswrapper[4880]: E0218 11:52:10.179865 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.243715 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.243750 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.243760 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.243775 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.243786 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:10Z","lastTransitionTime":"2026-02-18T11:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.347387 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.347417 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.347426 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.347444 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.347453 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:10Z","lastTransitionTime":"2026-02-18T11:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.449941 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.450023 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.450041 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.450056 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.450065 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:10Z","lastTransitionTime":"2026-02-18T11:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.552444 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.553132 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.553159 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.553174 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.553184 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:10Z","lastTransitionTime":"2026-02-18T11:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.656210 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.656250 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.656262 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.656276 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.656286 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:10Z","lastTransitionTime":"2026-02-18T11:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.758777 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.758832 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.758850 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.758915 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.758940 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:10Z","lastTransitionTime":"2026-02-18T11:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.861478 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.861533 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.861550 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.861573 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.861600 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:10Z","lastTransitionTime":"2026-02-18T11:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.964958 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.965017 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.965038 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.965064 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.965082 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:10Z","lastTransitionTime":"2026-02-18T11:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:10 crc kubenswrapper[4880]: I0218 11:52:10.981692 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:10 crc kubenswrapper[4880]: E0218 11:52:10.981923 4880 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:10 crc kubenswrapper[4880]: E0218 11:52:10.982046 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs podName:c6981df1-6d75-41e2-a41e-ac960f0a847a nodeName:}" failed. No retries permitted until 2026-02-18 11:52:14.982014465 +0000 UTC m=+42.410915356 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs") pod "network-metrics-daemon-nj7dq" (UID: "c6981df1-6d75-41e2-a41e-ac960f0a847a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.068651 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.068706 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.068727 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.068756 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.068777 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:11Z","lastTransitionTime":"2026-02-18T11:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.143031 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 05:53:57.415746904 +0000 UTC Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.171532 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.171604 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.171654 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.171682 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.171700 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:11Z","lastTransitionTime":"2026-02-18T11:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.178969 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:11 crc kubenswrapper[4880]: E0218 11:52:11.179110 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.274568 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.274714 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.274738 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.274770 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.274790 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:11Z","lastTransitionTime":"2026-02-18T11:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.378040 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.378102 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.378114 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.378135 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.378155 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:11Z","lastTransitionTime":"2026-02-18T11:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.481272 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.481345 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.481362 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.481382 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.481395 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:11Z","lastTransitionTime":"2026-02-18T11:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.585131 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.585220 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.585247 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.585277 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.585297 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:11Z","lastTransitionTime":"2026-02-18T11:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.688216 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.688255 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.688263 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.688276 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.688285 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:11Z","lastTransitionTime":"2026-02-18T11:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.791670 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.792094 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.792209 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.792401 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.792524 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:11Z","lastTransitionTime":"2026-02-18T11:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.895209 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.895305 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.895327 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.895353 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.895370 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:11Z","lastTransitionTime":"2026-02-18T11:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.998785 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.999161 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.999272 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.999401 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:11 crc kubenswrapper[4880]: I0218 11:52:11.999534 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:11Z","lastTransitionTime":"2026-02-18T11:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.102178 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.102489 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.102596 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.102722 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.102819 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:12Z","lastTransitionTime":"2026-02-18T11:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.144287 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 12:31:06.13803385 +0000 UTC Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.178773 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.178849 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:12 crc kubenswrapper[4880]: E0218 11:52:12.179055 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:12 crc kubenswrapper[4880]: E0218 11:52:12.179149 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.179826 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:12 crc kubenswrapper[4880]: E0218 11:52:12.180042 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.205722 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.205800 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.205817 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.205845 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.205862 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:12Z","lastTransitionTime":"2026-02-18T11:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.309035 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.309106 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.309119 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.309137 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.309151 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:12Z","lastTransitionTime":"2026-02-18T11:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.412054 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.412108 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.412123 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.412147 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.412158 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:12Z","lastTransitionTime":"2026-02-18T11:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.513863 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.513903 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.513920 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.513941 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.513983 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:12Z","lastTransitionTime":"2026-02-18T11:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.616245 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.616284 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.616296 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.616312 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.616325 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:12Z","lastTransitionTime":"2026-02-18T11:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.718447 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.718480 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.718492 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.718505 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.718513 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:12Z","lastTransitionTime":"2026-02-18T11:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.820220 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.820252 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.820261 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.820274 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.820284 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:12Z","lastTransitionTime":"2026-02-18T11:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.923449 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.924062 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.924274 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.924479 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:12 crc kubenswrapper[4880]: I0218 11:52:12.924710 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:12Z","lastTransitionTime":"2026-02-18T11:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.027962 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.028004 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.028015 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.028033 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.028048 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:13Z","lastTransitionTime":"2026-02-18T11:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.131174 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.131222 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.131236 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.131257 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.131268 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:13Z","lastTransitionTime":"2026-02-18T11:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.144783 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 13:12:29.478606865 +0000 UTC Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.179343 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:13 crc kubenswrapper[4880]: E0218 11:52:13.179563 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.194472 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.205893 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.227526 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11
:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.234191 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.234412 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.234504 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.234599 4880 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.234740 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:13Z","lastTransitionTime":"2026-02-18T11:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.243007 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.266883 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e419ee722ab4eb9f7321b53d92ba1745971ae988b9f6e258c306f2f0958111e7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:04Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0218 11:52:04.634507 6141 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0218 11:52:04.634528 6141 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:04.634560 6141 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634661 6141 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0218 11:52:04.634676 6141 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.634915 6141 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635074 6141 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635090 6141 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635139 6141 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:04.635229 6141 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"message\\\":\\\"factory.go:141\\\\nI0218 11:52:06.909429 6305 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:52:06.909547 6305 reflector.go:311] Stopping 
reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:52:06.909701 6305 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:06.910017 6305 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:52:06.910039 6305 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:52:06.910058 6305 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 11:52:06.910064 6305 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:52:06.910078 6305 factory.go:656] Stopping watch factory\\\\nI0218 11:52:06.910089 6305 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:52:06.910116 6305 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:06.910124 6305 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:52:06.910131 6305 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd66
8a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.277945 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.293141 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.304698 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.318767 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.329638 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc 
kubenswrapper[4880]: I0218 11:52:13.337507 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.337549 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.337561 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.337642 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.337658 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:13Z","lastTransitionTime":"2026-02-18T11:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.347521 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.360054 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.384057 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.397900 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.419915 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.442511 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.442552 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.442564 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.442581 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.442592 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:13Z","lastTransitionTime":"2026-02-18T11:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.443656 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\
\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276
e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded 
SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.461182 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:13Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.545441 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.545516 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.545528 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.545549 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.545564 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:13Z","lastTransitionTime":"2026-02-18T11:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.649334 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.649394 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.649409 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.649464 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.649479 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:13Z","lastTransitionTime":"2026-02-18T11:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.752653 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.752723 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.752734 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.752751 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.752760 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:13Z","lastTransitionTime":"2026-02-18T11:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.855984 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.856049 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.856061 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.856083 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.856094 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:13Z","lastTransitionTime":"2026-02-18T11:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.959478 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.959521 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.959530 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.959547 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:13 crc kubenswrapper[4880]: I0218 11:52:13.959557 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:13Z","lastTransitionTime":"2026-02-18T11:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.063380 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.063426 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.063443 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.063463 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.063476 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:14Z","lastTransitionTime":"2026-02-18T11:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.145780 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 17:21:50.687145742 +0000 UTC Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.167598 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.167739 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.167806 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.167840 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.167867 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:14Z","lastTransitionTime":"2026-02-18T11:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.179377 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.179437 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.179495 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:14 crc kubenswrapper[4880]: E0218 11:52:14.179643 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:14 crc kubenswrapper[4880]: E0218 11:52:14.179814 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:14 crc kubenswrapper[4880]: E0218 11:52:14.180016 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.271631 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.271667 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.271677 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.271691 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.271722 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:14Z","lastTransitionTime":"2026-02-18T11:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.374775 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.374853 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.374875 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.374909 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.374930 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:14Z","lastTransitionTime":"2026-02-18T11:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.478950 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.479035 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.479061 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.479094 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.479111 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:14Z","lastTransitionTime":"2026-02-18T11:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.583345 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.583430 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.583464 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.583495 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.583513 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:14Z","lastTransitionTime":"2026-02-18T11:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.686676 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.686785 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.686840 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.686863 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.686881 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:14Z","lastTransitionTime":"2026-02-18T11:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.792049 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.792111 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.792125 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.792152 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.792166 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:14Z","lastTransitionTime":"2026-02-18T11:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.895480 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.895526 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.895538 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.895567 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.895586 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:14Z","lastTransitionTime":"2026-02-18T11:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.998253 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.998301 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.998313 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.998330 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:14 crc kubenswrapper[4880]: I0218 11:52:14.998341 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:14Z","lastTransitionTime":"2026-02-18T11:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.031806 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:15 crc kubenswrapper[4880]: E0218 11:52:15.031871 4880 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:15 crc kubenswrapper[4880]: E0218 11:52:15.031939 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs podName:c6981df1-6d75-41e2-a41e-ac960f0a847a nodeName:}" failed. No retries permitted until 2026-02-18 11:52:23.031922428 +0000 UTC m=+50.460823289 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs") pod "network-metrics-daemon-nj7dq" (UID: "c6981df1-6d75-41e2-a41e-ac960f0a847a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.101571 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.101640 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.101650 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.101665 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.101673 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:15Z","lastTransitionTime":"2026-02-18T11:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.146742 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 23:25:14.382661805 +0000 UTC Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.179146 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:15 crc kubenswrapper[4880]: E0218 11:52:15.179279 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.204223 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.204274 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.204283 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.204300 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.204312 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:15Z","lastTransitionTime":"2026-02-18T11:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.306473 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.306509 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.306518 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.306530 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.306540 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:15Z","lastTransitionTime":"2026-02-18T11:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.408605 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.408686 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.408701 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.408721 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.408733 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:15Z","lastTransitionTime":"2026-02-18T11:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.510978 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.511018 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.511029 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.511044 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.511055 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:15Z","lastTransitionTime":"2026-02-18T11:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.613674 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.613726 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.613738 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.613755 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.613764 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:15Z","lastTransitionTime":"2026-02-18T11:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.716571 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.716628 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.716646 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.716661 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.716674 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:15Z","lastTransitionTime":"2026-02-18T11:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.821219 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.821292 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.821311 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.821343 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.821361 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:15Z","lastTransitionTime":"2026-02-18T11:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.924975 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.925041 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.925058 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.925081 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:15 crc kubenswrapper[4880]: I0218 11:52:15.925096 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:15Z","lastTransitionTime":"2026-02-18T11:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.028628 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.028698 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.028718 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.028801 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.028820 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:16Z","lastTransitionTime":"2026-02-18T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.132432 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.132491 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.132505 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.132533 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.132549 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:16Z","lastTransitionTime":"2026-02-18T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.147162 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 10:54:35.66984509 +0000 UTC Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.179873 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.179915 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.180030 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:16 crc kubenswrapper[4880]: E0218 11:52:16.180124 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:16 crc kubenswrapper[4880]: E0218 11:52:16.180359 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:16 crc kubenswrapper[4880]: E0218 11:52:16.180504 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.235717 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.235766 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.235777 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.235791 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.235803 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:16Z","lastTransitionTime":"2026-02-18T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.338632 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.338669 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.338679 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.338696 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.338706 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:16Z","lastTransitionTime":"2026-02-18T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.441133 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.441201 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.441238 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.441266 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.441289 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:16Z","lastTransitionTime":"2026-02-18T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.544363 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.544406 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.544417 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.544433 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.544443 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:16Z","lastTransitionTime":"2026-02-18T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.646888 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.646918 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.646927 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.646940 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.646949 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:16Z","lastTransitionTime":"2026-02-18T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.749684 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.749720 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.749729 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.749743 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.749751 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:16Z","lastTransitionTime":"2026-02-18T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.852111 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.852176 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.852199 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.852227 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.852250 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:16Z","lastTransitionTime":"2026-02-18T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.954329 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.954370 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.954380 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.954396 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:16 crc kubenswrapper[4880]: I0218 11:52:16.954405 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:16Z","lastTransitionTime":"2026-02-18T11:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.056314 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.056373 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.056390 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.056415 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.056427 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:17Z","lastTransitionTime":"2026-02-18T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.132988 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.133900 4880 scope.go:117] "RemoveContainer" containerID="96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.146818 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for 
caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\
"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.147359 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 22:00:12.444121571 +0000 UTC Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.159223 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.159258 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.159266 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.159281 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.159292 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:17Z","lastTransitionTime":"2026-02-18T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.160372 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1
c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.170721 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.179043 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:17 crc kubenswrapper[4880]: E0218 11:52:17.179170 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.183923 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.194446 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404
d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:5
1:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.207478 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.225467 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.237110 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.249426 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.260975 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.262206 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.262228 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.262262 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.262280 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.262289 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:17Z","lastTransitionTime":"2026-02-18T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.273960 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.291520 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"message\\\":\\\"factory.go:141\\\\nI0218 11:52:06.909429 6305 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:52:06.909547 6305 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:52:06.909701 6305 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:06.910017 6305 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:52:06.910039 6305 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:52:06.910058 6305 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 11:52:06.910064 6305 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:52:06.910078 6305 factory.go:656] Stopping watch factory\\\\nI0218 11:52:06.910089 6305 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:52:06.910116 6305 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:06.910124 6305 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:52:06.910131 6305 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b
7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.305884 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.318692 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.331091 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.340090 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.349893 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc 
kubenswrapper[4880]: I0218 11:52:17.364399 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.364425 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.364434 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.364448 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.364457 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:17Z","lastTransitionTime":"2026-02-18T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.465888 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.465920 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.465928 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.465940 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.465948 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:17Z","lastTransitionTime":"2026-02-18T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.470304 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/1.log" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.473102 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.473671 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.487668 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63
afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.498623 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b
8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.515857 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\
"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5
cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.536951 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\"
,\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd
90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.551329 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.563206 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.568101 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.568313 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.568407 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.568485 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.568578 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:17Z","lastTransitionTime":"2026-02-18T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.573045 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.590043 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1
fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.601689 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.613422 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.623063 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.643081 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"message\\\":\\\"factory.go:141\\\\nI0218 11:52:06.909429 6305 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:52:06.909547 6305 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:52:06.909701 6305 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:06.910017 6305 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:52:06.910039 6305 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:52:06.910058 6305 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 11:52:06.910064 6305 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:52:06.910078 6305 factory.go:656] Stopping watch factory\\\\nI0218 11:52:06.910089 6305 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:52:06.910116 6305 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:06.910124 6305 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:52:06.910131 6305 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.653810 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.664353 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.670971 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.671169 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.671230 4880 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.671294 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.671349 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:17Z","lastTransitionTime":"2026-02-18T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.677863 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 
18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.691233 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.701758 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.773567 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.773788 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.773853 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.773959 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.774019 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:17Z","lastTransitionTime":"2026-02-18T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.876743 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.876791 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.876800 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.876813 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.876822 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:17Z","lastTransitionTime":"2026-02-18T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.979003 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.979042 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.979053 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.979066 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:17 crc kubenswrapper[4880]: I0218 11:52:17.979075 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:17Z","lastTransitionTime":"2026-02-18T11:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.081340 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.081410 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.081422 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.081437 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.081445 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.148004 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 18:02:35.851624886 +0000 UTC Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.179646 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:18 crc kubenswrapper[4880]: E0218 11:52:18.179811 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.180030 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.180057 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:18 crc kubenswrapper[4880]: E0218 11:52:18.180205 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:18 crc kubenswrapper[4880]: E0218 11:52:18.180325 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.185555 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.185652 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.185751 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.185824 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.185836 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.288192 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.288228 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.288237 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.288250 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.288260 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.390630 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.390658 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.390669 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.390684 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.390694 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.477953 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/2.log" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.478712 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/1.log" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.482052 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad" exitCode=1 Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.482086 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad"} Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.482118 4880 scope.go:117] "RemoveContainer" containerID="96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.482660 4880 scope.go:117] "RemoveContainer" containerID="674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad" Feb 18 11:52:18 crc kubenswrapper[4880]: E0218 11:52:18.482800 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.492555 4880 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.492596 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.492622 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.492642 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.492654 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.499513 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.517713 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1
fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.528504 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.541342 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.551263 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.567400 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96882108ee3d5480d29df3b08f6ab41ccde630cb9b9226c9db8e90f4a50dbeac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"message\\\":\\\"factory.go:141\\\\nI0218 11:52:06.909429 6305 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:52:06.909547 6305 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0218 11:52:06.909701 6305 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:06.910017 6305 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0218 11:52:06.910039 6305 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0218 11:52:06.910058 6305 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 11:52:06.910064 6305 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:52:06.910078 6305 factory.go:656] Stopping watch factory\\\\nI0218 11:52:06.910089 6305 ovnkube.go:599] Stopped ovnkube\\\\nI0218 11:52:06.910116 6305 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0218 11:52:06.910124 6305 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0218 11:52:06.910131 6305 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:17Z\\\",\\\"message\\\":\\\"8 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0218 11:52:17.957161 6498 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z]\\\\nI0218 11:52:17.957174 6498 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0218 11:52:17.957181 6498 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0218 11:52:17.957189 6498 default_network_controller.go:776] Recording success event on pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.578696 4880 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.588877 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.595179 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.595207 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.595218 4880 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.595235 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.595245 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.597442 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 
18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.610499 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.622014 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.635965 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.645842 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf6
1e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.657882 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.669114 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 
11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.679784 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.690005 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.697673 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.697710 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.697719 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.697733 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.697743 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.800318 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.800361 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.800372 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.800388 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.800399 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.831053 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.831098 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.831111 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.831127 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.831138 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: E0218 11:52:18.843309 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.846763 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.846801 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.846812 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.846826 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.846861 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:18 crc kubenswrapper[4880]: E0218 11:52:18.858241 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.861278 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.861308 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.861320 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.861334 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.861346 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:52:18 crc kubenswrapper[4880]: E0218 11:52:18.873815 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z"
Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.877412 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.877462 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.877474 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.877494 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.877505 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:52:18 crc kubenswrapper[4880]: E0218 11:52:18.890423 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z"
Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.894259 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.894308 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.894323 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.894340 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.894354 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:52:18 crc kubenswrapper[4880]: E0218 11:52:18.906360 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:18Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:18 crc kubenswrapper[4880]: E0218 11:52:18.906501 4880 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.908077 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.908115 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.908124 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.908143 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:18 crc kubenswrapper[4880]: I0218 11:52:18.908163 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:18Z","lastTransitionTime":"2026-02-18T11:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.012980 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.013043 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.013058 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.013079 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.013098 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:19Z","lastTransitionTime":"2026-02-18T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.116181 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.116240 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.116253 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.116272 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.116297 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:19Z","lastTransitionTime":"2026-02-18T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.148751 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 03:27:14.452518255 +0000 UTC Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.178816 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:19 crc kubenswrapper[4880]: E0218 11:52:19.179004 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.219969 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.220067 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.220400 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.220448 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.220462 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:19Z","lastTransitionTime":"2026-02-18T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.322910 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.322954 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.322965 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.322979 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.322988 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:19Z","lastTransitionTime":"2026-02-18T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.426212 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.426251 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.426260 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.426275 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.426284 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:19Z","lastTransitionTime":"2026-02-18T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.486579 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/2.log" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.490461 4880 scope.go:117] "RemoveContainer" containerID="674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad" Feb 18 11:52:19 crc kubenswrapper[4880]: E0218 11:52:19.490658 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.504805 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.519013 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.528217 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.528250 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.528259 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.528291 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.528302 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:19Z","lastTransitionTime":"2026-02-18T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.530599 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.549885 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.559561 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.576024 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:17Z\\\",\\\"message\\\":\\\"8 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0218 11:52:17.957161 6498 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for 
anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z]\\\\nI0218 11:52:17.957174 6498 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0218 11:52:17.957181 6498 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0218 11:52:17.957189 6498 default_network_controller.go:776] Recording success event on pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b
7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.586784 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.597287 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.607010 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.616579 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.633288 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.633328 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.633339 4880 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.633356 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.633368 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:19Z","lastTransitionTime":"2026-02-18T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.649736 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 
18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.672667 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6
c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed082
87faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.684631 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.696839 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.708319 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.724374 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.736017 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.736052 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.736061 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.736080 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.736091 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:19Z","lastTransitionTime":"2026-02-18T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.738600 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 
certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee
1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:19Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.838398 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.838433 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.838443 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.838456 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.838466 4880 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:19Z","lastTransitionTime":"2026-02-18T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.940690 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.940734 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.940748 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.940764 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:19 crc kubenswrapper[4880]: I0218 11:52:19.940774 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:19Z","lastTransitionTime":"2026-02-18T11:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.043820 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.043854 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.043864 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.043879 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.043888 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:20Z","lastTransitionTime":"2026-02-18T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.147225 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.147269 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.147282 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.147296 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.147308 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:20Z","lastTransitionTime":"2026-02-18T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.149334 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:21:44.208165464 +0000 UTC Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.178830 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.178851 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.178830 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:20 crc kubenswrapper[4880]: E0218 11:52:20.178930 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:20 crc kubenswrapper[4880]: E0218 11:52:20.179033 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:20 crc kubenswrapper[4880]: E0218 11:52:20.179115 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.249676 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.249714 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.249725 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.249741 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.249750 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:20Z","lastTransitionTime":"2026-02-18T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.352448 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.352761 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.352885 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.353006 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.353120 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:20Z","lastTransitionTime":"2026-02-18T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.456772 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.456816 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.456831 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.456851 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.456865 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:20Z","lastTransitionTime":"2026-02-18T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.559647 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.559719 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.559737 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.559764 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.559778 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:20Z","lastTransitionTime":"2026-02-18T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.662328 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.662369 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.662382 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.662400 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.662413 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:20Z","lastTransitionTime":"2026-02-18T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.764934 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.764971 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.764980 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.764994 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.765004 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:20Z","lastTransitionTime":"2026-02-18T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.867573 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.867633 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.867644 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.867659 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.867670 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:20Z","lastTransitionTime":"2026-02-18T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.970383 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.970424 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.970437 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.970452 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:20 crc kubenswrapper[4880]: I0218 11:52:20.970462 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:20Z","lastTransitionTime":"2026-02-18T11:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.074023 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.074075 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.074086 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.074105 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.074119 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:21Z","lastTransitionTime":"2026-02-18T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.150347 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 00:16:52.088110966 +0000 UTC Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.177822 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.177871 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.177882 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.177902 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.177912 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:21Z","lastTransitionTime":"2026-02-18T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.178965 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:21 crc kubenswrapper[4880]: E0218 11:52:21.179077 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.281204 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.281305 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.281333 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.281378 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.281404 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:21Z","lastTransitionTime":"2026-02-18T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.384356 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.384416 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.384427 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.384444 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.384456 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:21Z","lastTransitionTime":"2026-02-18T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.487959 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.488017 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.488035 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.488059 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.488075 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:21Z","lastTransitionTime":"2026-02-18T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.590371 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.590420 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.590448 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.590472 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.590485 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:21Z","lastTransitionTime":"2026-02-18T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.693274 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.693314 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.693324 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.693340 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.693350 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:21Z","lastTransitionTime":"2026-02-18T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.796274 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.796320 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.796333 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.796351 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.796362 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:21Z","lastTransitionTime":"2026-02-18T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.900033 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.900354 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.900458 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.900571 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:21 crc kubenswrapper[4880]: I0218 11:52:21.900696 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:21Z","lastTransitionTime":"2026-02-18T11:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.003911 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.003994 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.004038 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.004069 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.004089 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:22Z","lastTransitionTime":"2026-02-18T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.107036 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.107073 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.107086 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.107101 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.107111 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:22Z","lastTransitionTime":"2026-02-18T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.150956 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 09:20:06.497596343 +0000 UTC Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.179507 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.179576 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.179506 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:22 crc kubenswrapper[4880]: E0218 11:52:22.179697 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:22 crc kubenswrapper[4880]: E0218 11:52:22.179812 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:22 crc kubenswrapper[4880]: E0218 11:52:22.179877 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.208922 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.208963 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.208974 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.208986 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.208998 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:22Z","lastTransitionTime":"2026-02-18T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.312087 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.312125 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.312137 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.312156 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.312167 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:22Z","lastTransitionTime":"2026-02-18T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.414413 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.414454 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.414465 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.414496 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.414509 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:22Z","lastTransitionTime":"2026-02-18T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.516263 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.516311 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.516321 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.516339 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.516355 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:22Z","lastTransitionTime":"2026-02-18T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.618926 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.618972 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.618987 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.619007 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.619018 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:22Z","lastTransitionTime":"2026-02-18T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.721795 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.721985 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.722024 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.722050 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.722066 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:22Z","lastTransitionTime":"2026-02-18T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.824426 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.824479 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.824493 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.824510 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.824792 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:22Z","lastTransitionTime":"2026-02-18T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.926861 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.926899 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.926908 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.926922 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:22 crc kubenswrapper[4880]: I0218 11:52:22.926931 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:22Z","lastTransitionTime":"2026-02-18T11:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.029808 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.029866 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.029883 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.029902 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.029915 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:23Z","lastTransitionTime":"2026-02-18T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.115660 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:23 crc kubenswrapper[4880]: E0218 11:52:23.115818 4880 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:23 crc kubenswrapper[4880]: E0218 11:52:23.115910 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs podName:c6981df1-6d75-41e2-a41e-ac960f0a847a nodeName:}" failed. No retries permitted until 2026-02-18 11:52:39.115891764 +0000 UTC m=+66.544792625 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs") pod "network-metrics-daemon-nj7dq" (UID: "c6981df1-6d75-41e2-a41e-ac960f0a847a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.132371 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.132407 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.132417 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.132433 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.132442 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:23Z","lastTransitionTime":"2026-02-18T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.151652 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 06:12:28.80504627 +0000 UTC Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.179041 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:23 crc kubenswrapper[4880]: E0218 11:52:23.179164 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.192560 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca0
0f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce
6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2
026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.208425 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\"
,\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd
90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.220412 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.230705 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.233966 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.234093 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.234165 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.234229 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.234293 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:23Z","lastTransitionTime":"2026-02-18T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.242195 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z 
is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.251981 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.271801 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.286693 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.301250 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.311121 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.321282 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.336515 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.336555 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.336567 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.336585 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.336596 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:23Z","lastTransitionTime":"2026-02-18T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.336935 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:17Z\\\",\\\"message\\\":\\\"8 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0218 11:52:17.957161 6498 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for 
anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z]\\\\nI0218 11:52:17.957174 6498 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0218 11:52:17.957181 6498 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0218 11:52:17.957189 6498 default_network_controller.go:776] Recording success event on pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b
7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.345977 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.357083 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.368299 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.378244 4880 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.388505 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:23Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.438843 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.439103 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.439115 4880 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.439130 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.439140 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:23Z","lastTransitionTime":"2026-02-18T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.541063 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.541100 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.541111 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.541126 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.541136 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:23Z","lastTransitionTime":"2026-02-18T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.643535 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.643584 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.643594 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.643627 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.643639 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:23Z","lastTransitionTime":"2026-02-18T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.745869 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.745896 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.745905 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.745919 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.745926 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:23Z","lastTransitionTime":"2026-02-18T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.847907 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.847948 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.847958 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.847971 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.847979 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:23Z","lastTransitionTime":"2026-02-18T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.922737 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:52:23 crc kubenswrapper[4880]: E0218 11:52:23.922836 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 11:52:55.922817359 +0000 UTC m=+83.351718210 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.922897 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.922949 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:23 crc kubenswrapper[4880]: E0218 11:52:23.923054 4880 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:52:23 crc kubenswrapper[4880]: E0218 11:52:23.923089 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-18 11:52:55.923081936 +0000 UTC m=+83.351982797 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 11:52:23 crc kubenswrapper[4880]: E0218 11:52:23.923229 4880 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:52:23 crc kubenswrapper[4880]: E0218 11:52:23.923334 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:55.923324642 +0000 UTC m=+83.352225503 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.950208 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.950236 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.950246 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.950260 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:23 crc kubenswrapper[4880]: I0218 11:52:23.950269 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:23Z","lastTransitionTime":"2026-02-18T11:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.024141 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.024184 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.024328 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.024343 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.024352 4880 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.024387 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:56.024376346 +0000 UTC m=+83.453277207 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.024427 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.024436 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.024443 4880 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.024460 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:52:56.024455078 +0000 UTC m=+83.453355939 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.052757 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.052793 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.052802 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.052815 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.052823 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:24Z","lastTransitionTime":"2026-02-18T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.152334 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 02:25:33.160230317 +0000 UTC Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.154734 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.154769 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.154781 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.154797 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.154808 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:24Z","lastTransitionTime":"2026-02-18T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.179204 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.179311 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.179213 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.179204 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.179382 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:24 crc kubenswrapper[4880]: E0218 11:52:24.179537 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.257030 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.257058 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.257067 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.257080 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.257088 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:24Z","lastTransitionTime":"2026-02-18T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.359160 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.359201 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.359214 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.359239 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.359254 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:24Z","lastTransitionTime":"2026-02-18T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.462024 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.462065 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.462075 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.462092 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.462104 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:24Z","lastTransitionTime":"2026-02-18T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.564952 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.564986 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.564996 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.565013 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.565024 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:24Z","lastTransitionTime":"2026-02-18T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.666882 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.666926 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.666947 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.666965 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.666976 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:24Z","lastTransitionTime":"2026-02-18T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.769675 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.769719 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.769732 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.769747 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.769758 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:24Z","lastTransitionTime":"2026-02-18T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.872450 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.872496 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.872509 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.872524 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.872533 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:24Z","lastTransitionTime":"2026-02-18T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.974524 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.974570 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.974587 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.974619 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:24 crc kubenswrapper[4880]: I0218 11:52:24.974633 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:24Z","lastTransitionTime":"2026-02-18T11:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.077589 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.077651 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.077663 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.077681 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.077693 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:25Z","lastTransitionTime":"2026-02-18T11:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.153510 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 19:04:20.916117778 +0000 UTC Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.179226 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:25 crc kubenswrapper[4880]: E0218 11:52:25.179626 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.180409 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.180462 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.180478 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.180495 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.180505 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:25Z","lastTransitionTime":"2026-02-18T11:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.282495 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.282524 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.282534 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.282547 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.282555 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:25Z","lastTransitionTime":"2026-02-18T11:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.385559 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.385597 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.385666 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.385682 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.385694 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:25Z","lastTransitionTime":"2026-02-18T11:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.488292 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.488324 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.488333 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.488346 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.488354 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:25Z","lastTransitionTime":"2026-02-18T11:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.591167 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.591194 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.591202 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.591216 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.591227 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:25Z","lastTransitionTime":"2026-02-18T11:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.693266 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.693335 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.693348 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.693365 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.693374 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:25Z","lastTransitionTime":"2026-02-18T11:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.795258 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.795303 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.795313 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.795329 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.795341 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:25Z","lastTransitionTime":"2026-02-18T11:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.900225 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.900279 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.900293 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.900336 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:25 crc kubenswrapper[4880]: I0218 11:52:25.900355 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:25Z","lastTransitionTime":"2026-02-18T11:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.002862 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.002902 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.002916 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.002934 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.002948 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:26Z","lastTransitionTime":"2026-02-18T11:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.106683 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.106760 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.106779 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.106797 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.106811 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:26Z","lastTransitionTime":"2026-02-18T11:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.153927 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 01:28:16.436614758 +0000 UTC Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.179300 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.179340 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:26 crc kubenswrapper[4880]: E0218 11:52:26.179437 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:26 crc kubenswrapper[4880]: E0218 11:52:26.179498 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.179828 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:26 crc kubenswrapper[4880]: E0218 11:52:26.180039 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.210423 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.210772 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.210957 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.211088 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.211240 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:26Z","lastTransitionTime":"2026-02-18T11:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.314083 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.314131 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.314143 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.314161 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.314173 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:26Z","lastTransitionTime":"2026-02-18T11:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.416412 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.416447 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.416459 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.416476 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.416490 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:26Z","lastTransitionTime":"2026-02-18T11:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.518093 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.518130 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.518141 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.518155 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.518164 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:26Z","lastTransitionTime":"2026-02-18T11:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.620836 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.620898 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.620914 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.620932 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.620944 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:26Z","lastTransitionTime":"2026-02-18T11:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.723241 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.723266 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.723275 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.723288 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.723297 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:26Z","lastTransitionTime":"2026-02-18T11:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.826138 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.826171 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.826181 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.826198 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.826210 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:26Z","lastTransitionTime":"2026-02-18T11:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.928341 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.928369 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.928378 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.928391 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:26 crc kubenswrapper[4880]: I0218 11:52:26.928400 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:26Z","lastTransitionTime":"2026-02-18T11:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.030781 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.030848 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.030859 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.030876 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.030886 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:27Z","lastTransitionTime":"2026-02-18T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.133001 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.133027 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.133035 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.133048 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.133057 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:27Z","lastTransitionTime":"2026-02-18T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.154494 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 02:10:34.07621005 +0000 UTC Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.178857 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:27 crc kubenswrapper[4880]: E0218 11:52:27.179374 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.235490 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.235525 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.235535 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.235550 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.235562 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:27Z","lastTransitionTime":"2026-02-18T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.338158 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.338765 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.338789 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.338805 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.338814 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:27Z","lastTransitionTime":"2026-02-18T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.412123 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.420926 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.424441 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.435367 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.440681 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.440727 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.440739 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.440758 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.440770 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:27Z","lastTransitionTime":"2026-02-18T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.448682 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.463235 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.473511 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc 
kubenswrapper[4880]: I0218 11:52:27.485124 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.497944 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.510239 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.522478 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.535326 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.542874 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.542932 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.542945 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.542964 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.542976 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:27Z","lastTransitionTime":"2026-02-18T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.547512 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 
certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee
1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.567862 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.579511 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.590125 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.607419 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11
:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.616979 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.634371 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:17Z\\\",\\\"message\\\":\\\"8 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0218 11:52:17.957161 6498 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for 
anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z]\\\\nI0218 11:52:17.957174 6498 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0218 11:52:17.957181 6498 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0218 11:52:17.957189 6498 default_network_controller.go:776] Recording success event on pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b
7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:27Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.646172 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.646282 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.646303 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.646329 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.646346 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:27Z","lastTransitionTime":"2026-02-18T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.748635 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.748664 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.748673 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.748687 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.748696 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:27Z","lastTransitionTime":"2026-02-18T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.851758 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.851791 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.851800 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.851812 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.851821 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:27Z","lastTransitionTime":"2026-02-18T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.954298 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.954337 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.954378 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.954393 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:27 crc kubenswrapper[4880]: I0218 11:52:27.954403 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:27Z","lastTransitionTime":"2026-02-18T11:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.056721 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.056764 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.056776 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.056794 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.056807 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:28Z","lastTransitionTime":"2026-02-18T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.154758 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 06:59:04.656773002 +0000 UTC Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.160884 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.160914 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.160924 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.160937 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.160945 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:28Z","lastTransitionTime":"2026-02-18T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.179574 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.179633 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.179688 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:28 crc kubenswrapper[4880]: E0218 11:52:28.179705 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:28 crc kubenswrapper[4880]: E0218 11:52:28.179783 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:28 crc kubenswrapper[4880]: E0218 11:52:28.179870 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.263005 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.263042 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.263053 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.263068 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.263078 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:28Z","lastTransitionTime":"2026-02-18T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.364545 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.364589 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.364602 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.364636 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.364655 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:28Z","lastTransitionTime":"2026-02-18T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.467249 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.467281 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.467290 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.467307 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.467316 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:28Z","lastTransitionTime":"2026-02-18T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.570072 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.570124 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.570136 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.570152 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.570163 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:28Z","lastTransitionTime":"2026-02-18T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.671940 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.671973 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.671981 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.671993 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.672001 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:28Z","lastTransitionTime":"2026-02-18T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.774214 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.774262 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.774274 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.774292 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.774303 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:28Z","lastTransitionTime":"2026-02-18T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.876563 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.876598 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.876624 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.876639 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.876649 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:28Z","lastTransitionTime":"2026-02-18T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.979137 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.979189 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.979200 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.979215 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:28 crc kubenswrapper[4880]: I0218 11:52:28.979225 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:28Z","lastTransitionTime":"2026-02-18T11:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.081571 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.081638 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.081648 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.081663 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.081671 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.155749 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 19:01:30.655885666 +0000 UTC Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.179514 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:29 crc kubenswrapper[4880]: E0218 11:52:29.179702 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.204518 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.204587 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.204644 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.204676 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.204699 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.238602 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.238684 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.238700 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.238724 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.238741 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: E0218 11:52:29.258193 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.263060 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.263115 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.263134 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.263194 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.263217 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: E0218 11:52:29.282198 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.286252 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.286301 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.286322 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.286346 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.286364 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: E0218 11:52:29.304424 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.310030 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.310080 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.310099 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.310118 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.310133 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: E0218 11:52:29.323120 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.330864 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.330931 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.330943 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.330959 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.330972 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: E0218 11:52:29.345739 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:29Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:29 crc kubenswrapper[4880]: E0218 11:52:29.345900 4880 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.347314 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.347348 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.347359 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.347375 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.347385 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.449599 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.449697 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.449709 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.449790 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.450311 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.552539 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.552573 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.552581 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.552595 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.552619 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.654987 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.655028 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.655041 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.655058 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.655071 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.757954 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.758003 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.758014 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.758031 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.758045 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.860624 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.860680 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.860700 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.860719 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.860733 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.963017 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.963055 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.963064 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.963078 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:29 crc kubenswrapper[4880]: I0218 11:52:29.963087 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:29Z","lastTransitionTime":"2026-02-18T11:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.065461 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.065517 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.065529 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.065546 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.065560 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:30Z","lastTransitionTime":"2026-02-18T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.156930 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 01:01:30.902405386 +0000 UTC Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.167971 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.168010 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.168019 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.168034 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.168041 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:30Z","lastTransitionTime":"2026-02-18T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.179353 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.179377 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:30 crc kubenswrapper[4880]: E0218 11:52:30.179471 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.179502 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:30 crc kubenswrapper[4880]: E0218 11:52:30.179680 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:30 crc kubenswrapper[4880]: E0218 11:52:30.179931 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.271085 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.271118 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.271127 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.271139 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.271148 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:30Z","lastTransitionTime":"2026-02-18T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.374827 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.374904 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.374926 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.374953 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.374974 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:30Z","lastTransitionTime":"2026-02-18T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.476913 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.476969 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.476987 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.477009 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.477026 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:30Z","lastTransitionTime":"2026-02-18T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.579000 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.579100 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.579115 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.579133 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.579148 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:30Z","lastTransitionTime":"2026-02-18T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.681334 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.681372 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.681381 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.681395 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.681406 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:30Z","lastTransitionTime":"2026-02-18T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.783716 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.783758 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.783769 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.783786 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.783795 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:30Z","lastTransitionTime":"2026-02-18T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.886879 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.886932 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.886941 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.886956 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.886965 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:30Z","lastTransitionTime":"2026-02-18T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.989264 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.989315 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.989328 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.989345 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:30 crc kubenswrapper[4880]: I0218 11:52:30.989355 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:30Z","lastTransitionTime":"2026-02-18T11:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.092406 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.092482 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.092502 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.092525 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.092543 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:31Z","lastTransitionTime":"2026-02-18T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.157692 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 16:38:28.051918775 +0000 UTC Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.179594 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:31 crc kubenswrapper[4880]: E0218 11:52:31.179874 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.194816 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.194877 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.194898 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.194923 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.194939 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:31Z","lastTransitionTime":"2026-02-18T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.298664 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.298739 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.298765 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.298790 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.298807 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:31Z","lastTransitionTime":"2026-02-18T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.401861 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.401914 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.401925 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.401941 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.401952 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:31Z","lastTransitionTime":"2026-02-18T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.504760 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.504857 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.504874 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.504900 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.504916 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:31Z","lastTransitionTime":"2026-02-18T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.608885 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.608941 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.608958 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.608980 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.608996 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:31Z","lastTransitionTime":"2026-02-18T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.710908 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.711000 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.711022 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.711047 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.711063 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:31Z","lastTransitionTime":"2026-02-18T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.813871 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.814185 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.814277 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.814369 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.814450 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:31Z","lastTransitionTime":"2026-02-18T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.917349 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.917398 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.917411 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.917428 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:31 crc kubenswrapper[4880]: I0218 11:52:31.917442 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:31Z","lastTransitionTime":"2026-02-18T11:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.020053 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.020101 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.020114 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.020132 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.020143 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:32Z","lastTransitionTime":"2026-02-18T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.122894 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.122942 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.122954 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.122970 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.122980 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:32Z","lastTransitionTime":"2026-02-18T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.158495 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 19:54:12.554687817 +0000 UTC Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.179251 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.179414 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:32 crc kubenswrapper[4880]: E0218 11:52:32.179698 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.180041 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:32 crc kubenswrapper[4880]: E0218 11:52:32.180257 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:32 crc kubenswrapper[4880]: E0218 11:52:32.180461 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.225589 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.225707 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.225733 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.225760 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.225781 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:32Z","lastTransitionTime":"2026-02-18T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.328479 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.328532 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.328545 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.328562 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.328574 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:32Z","lastTransitionTime":"2026-02-18T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.431577 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.431690 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.431725 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.431753 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.431770 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:32Z","lastTransitionTime":"2026-02-18T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.533860 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.533897 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.533907 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.533922 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.533931 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:32Z","lastTransitionTime":"2026-02-18T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.637041 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.637090 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.637100 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.637114 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.637123 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:32Z","lastTransitionTime":"2026-02-18T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.739248 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.739302 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.739320 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.739344 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.739361 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:32Z","lastTransitionTime":"2026-02-18T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.841693 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.841760 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.841782 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.841802 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.841815 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:32Z","lastTransitionTime":"2026-02-18T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.945457 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.945507 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.945517 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.945533 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:32 crc kubenswrapper[4880]: I0218 11:52:32.945543 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:32Z","lastTransitionTime":"2026-02-18T11:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.048126 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.048199 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.048213 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.048236 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.048256 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:33Z","lastTransitionTime":"2026-02-18T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.152119 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.152184 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.152199 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.152222 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.152237 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:33Z","lastTransitionTime":"2026-02-18T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.159588 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 19:00:06.012187877 +0000 UTC Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.178937 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:33 crc kubenswrapper[4880]: E0218 11:52:33.179185 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.180571 4880 scope.go:117] "RemoveContainer" containerID="674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad" Feb 18 11:52:33 crc kubenswrapper[4880]: E0218 11:52:33.181106 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.201877 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d8ff70e-3dd8-48c2-9348-b963de20a0da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237f827f8034d23febe06cc158b9ba7686a266e2d52daae9023cc7b39f89730a\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78e2154ad25bd018654841b56e08f81754a97d473e8f089667f243fbb4dab3c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9626f8acf726acc784c8338eed3d7124dc7a5324664ccbe3dca90b1ab06e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.215911 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.241896 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:17Z\\\",\\\"message\\\":\\\"8 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0218 11:52:17.957161 6498 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for 
anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z]\\\\nI0218 11:52:17.957174 6498 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0218 11:52:17.957181 6498 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0218 11:52:17.957189 6498 default_network_controller.go:776] Recording success event on pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b
7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.255480 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.255517 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.255530 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.255550 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.255563 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:33Z","lastTransitionTime":"2026-02-18T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.261387 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.278767 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc 
kubenswrapper[4880]: I0218 11:52:33.299782 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.315872 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.329901 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.351981 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e5
8c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.357588 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.357655 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.357669 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:33 crc 
kubenswrapper[4880]: I0218 11:52:33.357689 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.357701 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:33Z","lastTransitionTime":"2026-02-18T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.368471 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea
20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.381505 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\"
,\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd
90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.394391 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.408349 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.423277 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.446794 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.460988 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.461024 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.461036 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.461053 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.461065 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:33Z","lastTransitionTime":"2026-02-18T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.462765 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.475317 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.486194 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:33Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.563124 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.563158 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.563169 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.563184 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.563193 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:33Z","lastTransitionTime":"2026-02-18T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.665453 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.665496 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.665508 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.665525 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.665539 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:33Z","lastTransitionTime":"2026-02-18T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.768216 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.768271 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.768284 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.768303 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.768314 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:33Z","lastTransitionTime":"2026-02-18T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.870316 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.870373 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.870390 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.870410 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.870423 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:33Z","lastTransitionTime":"2026-02-18T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.973236 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.973577 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.973595 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.973639 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:33 crc kubenswrapper[4880]: I0218 11:52:33.973655 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:33Z","lastTransitionTime":"2026-02-18T11:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.075829 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.075875 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.075885 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.075902 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.075913 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:34Z","lastTransitionTime":"2026-02-18T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.159998 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 14:31:50.771572086 +0000 UTC Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.178003 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.178038 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.178048 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.178061 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.178070 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:34Z","lastTransitionTime":"2026-02-18T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.179557 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.179661 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:34 crc kubenswrapper[4880]: E0218 11:52:34.179686 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.179552 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:34 crc kubenswrapper[4880]: E0218 11:52:34.179800 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:34 crc kubenswrapper[4880]: E0218 11:52:34.179858 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.280184 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.280217 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.280229 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.280265 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.280278 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:34Z","lastTransitionTime":"2026-02-18T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.382324 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.382362 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.382370 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.382385 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.382394 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:34Z","lastTransitionTime":"2026-02-18T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.484836 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.484881 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.484893 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.484909 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.484920 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:34Z","lastTransitionTime":"2026-02-18T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.587098 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.587173 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.587186 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.587203 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.587215 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:34Z","lastTransitionTime":"2026-02-18T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.689153 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.689199 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.689211 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.689226 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.689237 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:34Z","lastTransitionTime":"2026-02-18T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.791362 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.791410 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.791426 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.791449 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.791464 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:34Z","lastTransitionTime":"2026-02-18T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.893880 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.893919 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.893931 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.893948 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.893960 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:34Z","lastTransitionTime":"2026-02-18T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.995798 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.995871 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.995881 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.995895 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:34 crc kubenswrapper[4880]: I0218 11:52:34.995905 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:34Z","lastTransitionTime":"2026-02-18T11:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.098391 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.098439 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.098454 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.098474 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.098488 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:35Z","lastTransitionTime":"2026-02-18T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.160146 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 00:52:14.069872645 +0000 UTC Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.179539 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:35 crc kubenswrapper[4880]: E0218 11:52:35.179715 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.200870 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.200911 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.200921 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.200934 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.200943 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:35Z","lastTransitionTime":"2026-02-18T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.303549 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.303601 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.303631 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.303660 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.303673 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:35Z","lastTransitionTime":"2026-02-18T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.405782 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.405828 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.405840 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.405858 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.405870 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:35Z","lastTransitionTime":"2026-02-18T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.509055 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.509097 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.509108 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.509123 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.509133 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:35Z","lastTransitionTime":"2026-02-18T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.611272 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.611318 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.611330 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.611346 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.611357 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:35Z","lastTransitionTime":"2026-02-18T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.714055 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.714111 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.714134 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.714169 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.714184 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:35Z","lastTransitionTime":"2026-02-18T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.816184 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.816224 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.816235 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.816250 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.816263 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:35Z","lastTransitionTime":"2026-02-18T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.918826 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.918870 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.918879 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.918894 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:35 crc kubenswrapper[4880]: I0218 11:52:35.918904 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:35Z","lastTransitionTime":"2026-02-18T11:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.021596 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.021655 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.021667 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.021684 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.021696 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:36Z","lastTransitionTime":"2026-02-18T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.124134 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.124168 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.124177 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.124190 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.124200 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:36Z","lastTransitionTime":"2026-02-18T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.161025 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 18:03:35.569248081 +0000 UTC Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.179403 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.179420 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.179481 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:36 crc kubenswrapper[4880]: E0218 11:52:36.179521 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:36 crc kubenswrapper[4880]: E0218 11:52:36.179627 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:36 crc kubenswrapper[4880]: E0218 11:52:36.179720 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.226443 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.226487 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.226498 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.226512 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.226523 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:36Z","lastTransitionTime":"2026-02-18T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.328321 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.328366 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.328389 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.328408 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.328419 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:36Z","lastTransitionTime":"2026-02-18T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.430899 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.430941 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.430953 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.430970 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.430981 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:36Z","lastTransitionTime":"2026-02-18T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.533522 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.533563 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.533574 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.533589 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.533600 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:36Z","lastTransitionTime":"2026-02-18T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.636105 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.636143 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.636152 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.636166 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.636174 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:36Z","lastTransitionTime":"2026-02-18T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.739026 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.739088 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.739103 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.739121 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.739131 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:36Z","lastTransitionTime":"2026-02-18T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.841662 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.841716 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.841730 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.841749 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.841761 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:36Z","lastTransitionTime":"2026-02-18T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.944013 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.944103 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.944116 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.944129 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:36 crc kubenswrapper[4880]: I0218 11:52:36.944138 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:36Z","lastTransitionTime":"2026-02-18T11:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.045848 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.045887 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.045896 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.045909 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.045945 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:37Z","lastTransitionTime":"2026-02-18T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.147689 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.147743 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.147756 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.147779 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.147794 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:37Z","lastTransitionTime":"2026-02-18T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.162222 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 05:43:49.753615044 +0000 UTC Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.179587 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:37 crc kubenswrapper[4880]: E0218 11:52:37.179711 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.250675 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.250712 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.250721 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.250734 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.250743 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:37Z","lastTransitionTime":"2026-02-18T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.354355 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.354445 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.354459 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.354476 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.354490 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:37Z","lastTransitionTime":"2026-02-18T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.458052 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.458130 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.458157 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.458186 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.458208 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:37Z","lastTransitionTime":"2026-02-18T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.560740 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.560768 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.560777 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.560791 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.560799 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:37Z","lastTransitionTime":"2026-02-18T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.662735 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.662781 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.662791 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.662826 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.662835 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:37Z","lastTransitionTime":"2026-02-18T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.765088 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.765133 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.765144 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.765161 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.765171 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:37Z","lastTransitionTime":"2026-02-18T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.867039 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.867089 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.867102 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.867119 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.867129 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:37Z","lastTransitionTime":"2026-02-18T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.968929 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.968972 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.968985 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.969003 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:37 crc kubenswrapper[4880]: I0218 11:52:37.969015 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:37Z","lastTransitionTime":"2026-02-18T11:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.070899 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.070947 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.070957 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.070972 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.070984 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:38Z","lastTransitionTime":"2026-02-18T11:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.163032 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 04:27:59.500510648 +0000 UTC Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.173085 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.173125 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.173136 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.173152 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.173164 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:38Z","lastTransitionTime":"2026-02-18T11:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.179431 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.179461 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.179438 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:38 crc kubenswrapper[4880]: E0218 11:52:38.179555 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:38 crc kubenswrapper[4880]: E0218 11:52:38.179639 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:38 crc kubenswrapper[4880]: E0218 11:52:38.179738 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.275569 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.275639 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.275650 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.275665 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.275680 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:38Z","lastTransitionTime":"2026-02-18T11:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.379406 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.379463 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.379474 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.379492 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.379506 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:38Z","lastTransitionTime":"2026-02-18T11:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.481948 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.481997 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.482006 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.482022 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.482031 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:38Z","lastTransitionTime":"2026-02-18T11:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.583598 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.583656 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.583668 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.583687 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.583698 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:38Z","lastTransitionTime":"2026-02-18T11:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.685855 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.685960 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.685988 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.686022 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.686048 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:38Z","lastTransitionTime":"2026-02-18T11:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.788438 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.788487 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.788511 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.788531 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.788543 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:38Z","lastTransitionTime":"2026-02-18T11:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.891103 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.891174 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.891207 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.891235 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.891253 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:38Z","lastTransitionTime":"2026-02-18T11:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.993996 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.994048 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.994060 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.994076 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:38 crc kubenswrapper[4880]: I0218 11:52:38.994090 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:38Z","lastTransitionTime":"2026-02-18T11:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.097772 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.097845 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.097865 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.097896 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.097918 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.163641 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 00:34:26.794403351 +0000 UTC Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.178279 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:39 crc kubenswrapper[4880]: E0218 11:52:39.178444 4880 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:39 crc kubenswrapper[4880]: E0218 11:52:39.178502 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs podName:c6981df1-6d75-41e2-a41e-ac960f0a847a nodeName:}" failed. No retries permitted until 2026-02-18 11:53:11.178488296 +0000 UTC m=+98.607389157 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs") pod "network-metrics-daemon-nj7dq" (UID: "c6981df1-6d75-41e2-a41e-ac960f0a847a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.178714 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:39 crc kubenswrapper[4880]: E0218 11:52:39.178814 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.200280 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.200591 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.200729 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.200842 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.200972 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.303347 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.303598 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.303698 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.303773 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.303850 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.407201 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.407449 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.407569 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.407694 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.407790 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.510410 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.510452 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.510465 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.510483 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.510498 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.613442 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.613488 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.613497 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.613513 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.613524 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.630497 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.630551 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.630567 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.630583 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.630592 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: E0218 11:52:39.643578 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.647826 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.647871 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.647882 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.647898 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.647908 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: E0218 11:52:39.663740 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:39Z is after 2025-08-24T17:21:41Z"
Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.668871 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.668907 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.668916 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.668931 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.668942 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: E0218 11:52:39.686646 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:39Z is after 2025-08-24T17:21:41Z"
Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.690630 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.690746 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.690816 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.690892 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.690963 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: E0218 11:52:39.701437 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.709552 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.709626 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.709640 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.709657 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.709670 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: E0218 11:52:39.722827 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:39Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:39 crc kubenswrapper[4880]: E0218 11:52:39.722942 4880 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.724387 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.724410 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.724419 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.724432 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.724441 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.827016 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.827053 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.827062 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.827076 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.827086 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.929656 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.929729 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.929745 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.929766 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:39 crc kubenswrapper[4880]: I0218 11:52:39.929780 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:39Z","lastTransitionTime":"2026-02-18T11:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.031911 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.032190 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.032207 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.032220 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.032232 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:40Z","lastTransitionTime":"2026-02-18T11:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.134235 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.134292 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.134304 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.134324 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.134337 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:40Z","lastTransitionTime":"2026-02-18T11:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.164598 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 21:47:45.595550956 +0000 UTC Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.178874 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.178917 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:40 crc kubenswrapper[4880]: E0218 11:52:40.178991 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:40 crc kubenswrapper[4880]: E0218 11:52:40.179056 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.179240 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:40 crc kubenswrapper[4880]: E0218 11:52:40.179361 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.236635 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.236667 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.236675 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.236687 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.236695 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:40Z","lastTransitionTime":"2026-02-18T11:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.339327 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.339393 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.339405 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.339422 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.339433 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:40Z","lastTransitionTime":"2026-02-18T11:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.441395 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.441641 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.441717 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.441806 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.441873 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:40Z","lastTransitionTime":"2026-02-18T11:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.544325 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.544898 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.544995 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.545089 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.545170 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:40Z","lastTransitionTime":"2026-02-18T11:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.647197 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.647238 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.647247 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.647265 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.647275 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:40Z","lastTransitionTime":"2026-02-18T11:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.750308 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.750353 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.750365 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.750384 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.750395 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:40Z","lastTransitionTime":"2026-02-18T11:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.853987 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.854041 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.854053 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.854073 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.854090 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:40Z","lastTransitionTime":"2026-02-18T11:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.956378 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.956678 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.956787 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.956861 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:40 crc kubenswrapper[4880]: I0218 11:52:40.956923 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:40Z","lastTransitionTime":"2026-02-18T11:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.060290 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.060341 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.060355 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.060374 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.060389 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:41Z","lastTransitionTime":"2026-02-18T11:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.162682 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.162743 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.162755 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.162770 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.162779 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:41Z","lastTransitionTime":"2026-02-18T11:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.165092 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 11:03:23.180214155 +0000 UTC Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.179447 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:41 crc kubenswrapper[4880]: E0218 11:52:41.179960 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.265825 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.265866 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.265874 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.265893 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.265903 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:41Z","lastTransitionTime":"2026-02-18T11:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.368121 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.368198 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.368224 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.368259 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.368282 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:41Z","lastTransitionTime":"2026-02-18T11:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.470860 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.470906 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.470922 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.470942 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.470954 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:41Z","lastTransitionTime":"2026-02-18T11:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.573229 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.573276 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.573290 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.573307 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.573318 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:41Z","lastTransitionTime":"2026-02-18T11:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.675823 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.675867 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.675882 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.675902 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.675917 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:41Z","lastTransitionTime":"2026-02-18T11:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.779326 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.779394 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.779408 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.779432 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.779447 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:41Z","lastTransitionTime":"2026-02-18T11:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.882351 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.882417 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.882433 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.882457 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.882471 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:41Z","lastTransitionTime":"2026-02-18T11:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.985340 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.985401 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.985415 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.985438 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:41 crc kubenswrapper[4880]: I0218 11:52:41.985456 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:41Z","lastTransitionTime":"2026-02-18T11:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.088650 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.088731 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.088743 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.088762 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.088775 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:42Z","lastTransitionTime":"2026-02-18T11:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.165724 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 22:39:16.739286055 +0000 UTC Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.179209 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.179221 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.179233 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:42 crc kubenswrapper[4880]: E0218 11:52:42.179487 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:42 crc kubenswrapper[4880]: E0218 11:52:42.179594 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:42 crc kubenswrapper[4880]: E0218 11:52:42.179694 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.191970 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.192019 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.192031 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.192087 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.192101 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:42Z","lastTransitionTime":"2026-02-18T11:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.295003 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.295077 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.295090 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.295113 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.295127 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:42Z","lastTransitionTime":"2026-02-18T11:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.398192 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.398258 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.398269 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.398286 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.398297 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:42Z","lastTransitionTime":"2026-02-18T11:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.501776 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.501837 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.501850 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.501872 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.501884 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:42Z","lastTransitionTime":"2026-02-18T11:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.557962 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mh8wn_3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8/kube-multus/0.log" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.558026 4880 generic.go:334] "Generic (PLEG): container finished" podID="3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8" containerID="4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d" exitCode=1 Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.558053 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mh8wn" event={"ID":"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8","Type":"ContainerDied","Data":"4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d"} Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.558384 4880 scope.go:117] "RemoveContainer" containerID="4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.569716 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.591400 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11
:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.603708 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.604960 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.605003 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.605014 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.605030 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.605041 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:42Z","lastTransitionTime":"2026-02-18T11:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.615652 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.629673 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d8ff70e-3dd8-48c2-9348-b963de20a0da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237f827f8034d23febe06cc158b9ba7686a266e2d52daae9023cc7b39f89730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78e2154ad25bd018654841b56e08f81754a97d473e8f089667f243fbb4dab3c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9626f8acf726acc784c8338eed3d7124dc7a5324664ccbe3dca90b1ab06e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.642645 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c
187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.665291 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:17Z\\\",\\\"message\\\":\\\"8 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0218 11:52:17.957161 6498 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for 
anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z]\\\\nI0218 11:52:17.957174 6498 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0218 11:52:17.957181 6498 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0218 11:52:17.957189 6498 default_network_controller.go:776] Recording success event on pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b
7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.682333 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.696455 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.707488 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.707534 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.707545 4880 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.707562 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.707574 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:42Z","lastTransitionTime":"2026-02-18T11:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.710112 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 
18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.724652 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.740746 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.756801 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:41Z\\\",\\\"message\\\":\\\"2026-02-18T11:51:55+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9\\\\n2026-02-18T11:51:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9 to /host/opt/cni/bin/\\\\n2026-02-18T11:51:56Z [verbose] multus-daemon started\\\\n2026-02-18T11:51:56Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:52:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.770132 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e5
8c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.786917 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.802291 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 
11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.809464 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.809492 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.809502 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.809514 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.809524 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:42Z","lastTransitionTime":"2026-02-18T11:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.815675 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a
79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.827627 4880 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:42Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.911690 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.911717 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.911726 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.911739 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:42 crc kubenswrapper[4880]: I0218 11:52:42.911747 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:42Z","lastTransitionTime":"2026-02-18T11:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.013571 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.013621 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.013630 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.013645 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.013654 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:43Z","lastTransitionTime":"2026-02-18T11:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.116069 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.116103 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.116111 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.116124 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.116132 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:43Z","lastTransitionTime":"2026-02-18T11:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.166367 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 10:39:41.53274202 +0000 UTC Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.178988 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:43 crc kubenswrapper[4880]: E0218 11:52:43.179155 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.190661 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d8ff70e-3dd8-48c2-9348-b963de20a0da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237f827f8034d23febe06cc158b9ba7686a266e2d52daae9023cc7b39f89730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78e2154ad25bd018654841b56e08f81754a97d473e8f089667f243fbb4dab3c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9626f8acf726acc784c8338eed3d7124dc7a5324664ccbe3dca90b1ab06e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.201396 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.219329 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:17Z\\\",\\\"message\\\":\\\"8 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0218 11:52:17.957161 6498 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for 
anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z]\\\\nI0218 11:52:17.957174 6498 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0218 11:52:17.957181 6498 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0218 11:52:17.957189 6498 default_network_controller.go:776] Recording success event on pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b
7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.219715 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.219735 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.219745 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.219760 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.219770 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:43Z","lastTransitionTime":"2026-02-18T11:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.231830 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.244240 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 
2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.256848 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.267398 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.281026 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc 
kubenswrapper[4880]: I0218 11:52:43.292330 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f764332190
5da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd9
0d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.304583 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.315129 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z"
Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.325322 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.325356 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.325368 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.325384 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.325395 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:43Z","lastTransitionTime":"2026-02-18T11:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.328938 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:41Z\\\",\\\"message\\\":\\\"2026-02-18T11:51:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9\\\\n2026-02-18T11:51:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9 to /host/opt/cni/bin/\\\\n2026-02-18T11:51:56Z [verbose] multus-daemon started\\\\n2026-02-18T11:51:56Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:52:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.339959 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.351652 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.370422 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.382770 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.395983 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.405099 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.427648 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.427687 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.427699 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.427716 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.427727 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:43Z","lastTransitionTime":"2026-02-18T11:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.530420 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.530463 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.530474 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.530489 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.530499 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:43Z","lastTransitionTime":"2026-02-18T11:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.562380 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mh8wn_3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8/kube-multus/0.log" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.562433 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mh8wn" event={"ID":"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8","Type":"ContainerStarted","Data":"6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.576363 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.588508 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:41Z\\\",\\\"message\\\":\\\"2026-02-18T11:51:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9\\\\n2026-02-18T11:51:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9 to /host/opt/cni/bin/\\\\n2026-02-18T11:51:56Z [verbose] multus-daemon started\\\\n2026-02-18T11:51:56Z [verbose] 
Readiness Indicator file check\\\\n2026-02-18T11:52:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.599304 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e5
8c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.612550 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.624305 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 
11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.632669 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.632708 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.632721 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.632738 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.632751 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:43Z","lastTransitionTime":"2026-02-18T11:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.635678 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a
79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.646575 4880 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.655896 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.673166 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11
:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.683291 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.699563 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:17Z\\\",\\\"message\\\":\\\"8 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0218 11:52:17.957161 6498 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z]\\\\nI0218 11:52:17.957174 6498 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0218 11:52:17.957181 6498 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0218 11:52:17.957189 6498 default_network_controller.go:776] Recording success event on pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b
7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.709083 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d8ff70e-3dd8-48c2-9348-b963de20a0da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237f827f8034d23febe06cc158b9ba7686a266e2d52daae9023cc7b39f89730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78e2154ad25bd018654841b56e08f81754a97d473e8f089667f243fbb4dab3c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9626f8acf726acc784c8338eed3d7124dc7a5324664ccbe3dca90b1ab06e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.717590 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c
187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.729502 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.737183 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.737220 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.737232 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.737246 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.737254 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:43Z","lastTransitionTime":"2026-02-18T11:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.741268 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.752755 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.762423 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc 
kubenswrapper[4880]: I0218 11:52:43.776232 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:43Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.839847 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.839895 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.839904 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.839921 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.839930 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:43Z","lastTransitionTime":"2026-02-18T11:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.941981 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.942026 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.942037 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.942053 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:43 crc kubenswrapper[4880]: I0218 11:52:43.942063 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:43Z","lastTransitionTime":"2026-02-18T11:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.044302 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.044332 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.044341 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.044354 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.044363 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:44Z","lastTransitionTime":"2026-02-18T11:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.147011 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.147057 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.147068 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.147084 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.147096 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:44Z","lastTransitionTime":"2026-02-18T11:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.166644 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 12:06:35.30436319 +0000 UTC Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.178803 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.178853 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.178813 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:44 crc kubenswrapper[4880]: E0218 11:52:44.178897 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:44 crc kubenswrapper[4880]: E0218 11:52:44.178973 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:44 crc kubenswrapper[4880]: E0218 11:52:44.179100 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.249679 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.249742 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.249755 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.249771 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.249789 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:44Z","lastTransitionTime":"2026-02-18T11:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.351915 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.351953 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.351963 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.351979 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.351989 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:44Z","lastTransitionTime":"2026-02-18T11:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.454220 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.454251 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.454260 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.454273 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.454281 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:44Z","lastTransitionTime":"2026-02-18T11:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.556715 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.556762 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.556775 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.556793 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.556804 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:44Z","lastTransitionTime":"2026-02-18T11:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.659876 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.659927 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.659942 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.659962 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.659976 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:44Z","lastTransitionTime":"2026-02-18T11:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.762280 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.762693 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.762851 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.762997 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.763128 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:44Z","lastTransitionTime":"2026-02-18T11:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.866123 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.866222 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.866233 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.866250 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.866276 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:44Z","lastTransitionTime":"2026-02-18T11:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.968340 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.968382 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.968397 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.968412 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:44 crc kubenswrapper[4880]: I0218 11:52:44.968422 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:44Z","lastTransitionTime":"2026-02-18T11:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.070719 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.070765 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.070777 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.070795 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.070806 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:45Z","lastTransitionTime":"2026-02-18T11:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.167801 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:55:05.967300409 +0000 UTC Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.173660 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.173737 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.173747 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.173762 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.173771 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:45Z","lastTransitionTime":"2026-02-18T11:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.179126 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:45 crc kubenswrapper[4880]: E0218 11:52:45.179445 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.192409 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.276046 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.276078 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.276087 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.276100 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.276110 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:45Z","lastTransitionTime":"2026-02-18T11:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.378673 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.378734 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.378750 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.378773 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.378789 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:45Z","lastTransitionTime":"2026-02-18T11:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.481574 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.481632 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.481643 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.481659 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.481672 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:45Z","lastTransitionTime":"2026-02-18T11:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.584005 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.584039 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.584047 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.584060 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.584069 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:45Z","lastTransitionTime":"2026-02-18T11:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.685972 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.686010 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.686021 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.686035 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.686044 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:45Z","lastTransitionTime":"2026-02-18T11:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.788356 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.788387 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.788396 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.788427 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.788436 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:45Z","lastTransitionTime":"2026-02-18T11:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.890779 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.890810 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.890820 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.890832 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.890842 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:45Z","lastTransitionTime":"2026-02-18T11:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.993526 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.993567 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.993579 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.993636 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:45 crc kubenswrapper[4880]: I0218 11:52:45.993647 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:45Z","lastTransitionTime":"2026-02-18T11:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.096324 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.096351 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.096361 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.096373 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.096381 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:46Z","lastTransitionTime":"2026-02-18T11:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.168250 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 20:18:22.284317103 +0000 UTC
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.179672 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.179685 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.180004 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:52:46 crc kubenswrapper[4880]: E0218 11:52:46.180334 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 11:52:46 crc kubenswrapper[4880]: E0218 11:52:46.180809 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 11:52:46 crc kubenswrapper[4880]: E0218 11:52:46.180988 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.181023 4880 scope.go:117] "RemoveContainer" containerID="674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.198510 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.198545 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.198553 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.198566 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.198575 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:46Z","lastTransitionTime":"2026-02-18T11:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.302261 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.302303 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.302315 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.302330 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.302340 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:46Z","lastTransitionTime":"2026-02-18T11:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.405019 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.405059 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.405072 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.405088 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.405099 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:46Z","lastTransitionTime":"2026-02-18T11:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.507350 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.507389 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.507399 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.507413 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.507423 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:46Z","lastTransitionTime":"2026-02-18T11:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.573230 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/2.log" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.575599 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881"} Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.575944 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.596489 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5
ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.611305 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.611425 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.611441 4880 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.611464 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.611476 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:46Z","lastTransitionTime":"2026-02-18T11:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.612771 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.626134 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.639092 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.652214 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d8ff70e-3dd8-48c2-9348-b963de20a0da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237f827f8034d23febe06cc158b9ba7686a266e2d52daae9023cc7b39f89730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78e2154ad25bd018654841b56e08f81754a97d473e8f089667f243fbb4dab3c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9626f8acf726acc784c8338eed3d7124dc7a5324664ccbe3dca90b1ab06e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.667361 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.692914 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:17Z\\\",\\\"message\\\":\\\"8 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0218 11:52:17.957161 6498 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for 
anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z]\\\\nI0218 11:52:17.957174 6498 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0218 11:52:17.957181 6498 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0218 11:52:17.957189 6498 default_network_controller.go:776] Recording success event on pod 
opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.705110 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c79fb0e-fda9-432a-aa44-7d013393deee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f59e6228c0008882a58e27dda27703242aa49db4db2d03be0cbe9fe055567e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770ed0dcb2943b2a6110fab48f3531d9af62bf5f9e9545bfea3400ebf6b6c564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://770ed0dcb2943b2a6110fab48f3531d9af62bf5f9e9545bfea3400ebf6b6c564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.713952 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.714000 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.714011 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.714025 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.714035 4880 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:46Z","lastTransitionTime":"2026-02-18T11:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.717452 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.740600 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.753818 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.766132 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.777683 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc 
kubenswrapper[4880]: I0218 11:52:46.791599 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f764332190
5da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd9
0d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.805411 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.816598 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.816648 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.816658 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.816677 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.816688 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:46Z","lastTransitionTime":"2026-02-18T11:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.819923 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.839844 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:41Z\\\",\\\"message\\\":\\\"2026-02-18T11:51:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9\\\\n2026-02-18T11:51:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9 to /host/opt/cni/bin/\\\\n2026-02-18T11:51:56Z [verbose] multus-daemon started\\\\n2026-02-18T11:51:56Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:52:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\
\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.853337 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e5
8c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.870878 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:46Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.919462 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.919509 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.919525 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.919548 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:46 crc kubenswrapper[4880]: I0218 11:52:46.919563 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:46Z","lastTransitionTime":"2026-02-18T11:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.022823 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.022923 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.022958 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.022991 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.023011 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:47Z","lastTransitionTime":"2026-02-18T11:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.126487 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.127062 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.127078 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.127101 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.127117 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:47Z","lastTransitionTime":"2026-02-18T11:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.169221 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 17:58:44.777531851 +0000 UTC Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.179670 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:47 crc kubenswrapper[4880]: E0218 11:52:47.179819 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.229319 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.229384 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.229400 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.229425 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.229437 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:47Z","lastTransitionTime":"2026-02-18T11:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.332263 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.332334 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.332355 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.332384 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.332400 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:47Z","lastTransitionTime":"2026-02-18T11:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.436344 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.436438 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.436465 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.436503 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.436529 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:47Z","lastTransitionTime":"2026-02-18T11:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.540312 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.540362 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.540372 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.540391 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.540405 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:47Z","lastTransitionTime":"2026-02-18T11:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.579766 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/3.log" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.580475 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/2.log" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.584456 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881" exitCode=1 Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.584527 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.584601 4880 scope.go:117] "RemoveContainer" containerID="674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.585181 4880 scope.go:117] "RemoveContainer" containerID="fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881" Feb 18 11:52:47 crc kubenswrapper[4880]: E0218 11:52:47.585321 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.597225 4880 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c79fb0e-fda9-432a-aa44-7d013393deee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f59e6228c0008882a58e27dda27703242aa49db4db2d03be0cbe9fe055567e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770ed0dcb2943b2a6110fab48f3531d9af62bf5f9e9545bfea3400ebf6b6c564\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://770ed0dcb2943b2a6110fab48f3531d9af62bf5f9e9545bfea3400ebf6b6c564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.610546 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.623124 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.634210 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.644115 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.644147 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.644156 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.644168 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.644179 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:47Z","lastTransitionTime":"2026-02-18T11:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.646158 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.657337 4880 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc 
kubenswrapper[4880]: I0218 11:52:47.671758 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f764332190
5da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd9
0d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.687209 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.698639 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.710592 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:41Z\\\",\\\"message\\\":\\\"2026-02-18T11:51:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9\\\\n2026-02-18T11:51:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9 to /host/opt/cni/bin/\\\\n2026-02-18T11:51:56Z [verbose] multus-daemon started\\\\n2026-02-18T11:51:56Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:52:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.720837 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e5
8c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.733138 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.749432 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.749464 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.749472 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.749486 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.749494 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:47Z","lastTransitionTime":"2026-02-18T11:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.749703 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.759582 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.769845 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.778201 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.788858 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d8ff70e-3dd8-48c2-9348-b963de20a0da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237f827f8034d23febe06cc158b9ba7686a266e2d52daae9023cc7b39f89730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78e2154ad25bd018654841b56e08f81754a97d473e8f089667f243fbb4dab3c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9626f8acf726acc784c8338eed3d7124dc7a5324664ccbe3dca90b1ab06e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.808864 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.831349 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://674daca3e4e2a783b572f53b981e2b821d6a8b60d84ff3b982483a807d9308ad\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:17Z\\\",\\\"message\\\":\\\"8 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0218 11:52:17.957161 6498 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for 
anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:17Z is after 2025-08-24T17:21:41Z]\\\\nI0218 11:52:17.957174 6498 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0218 11:52:17.957181 6498 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0218 11:52:17.957189 6498 default_network_controller.go:776] Recording success event on pod opens\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:47Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.130149 6912 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.130265 6912 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.130671 6912 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.131341 6912 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0218 11:52:47.131366 6912 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:52:47.131384 6912 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0218 11:52:47.131399 6912 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:52:47.131401 6912 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:52:47.131406 6912 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:52:47.131418 6912 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:52:47.131427 6912 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:52:47.131432 6912 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 11:52:47.131447 6912 factory.go:656] Stopping watch factory\\\\nI0218 11:52:47.131465 6912 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:47Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.852435 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.852489 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.852499 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.852513 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.852523 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:47Z","lastTransitionTime":"2026-02-18T11:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.955760 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.955790 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.955799 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.955811 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:47 crc kubenswrapper[4880]: I0218 11:52:47.955819 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:47Z","lastTransitionTime":"2026-02-18T11:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.058004 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.058038 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.058051 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.058067 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.058079 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:48Z","lastTransitionTime":"2026-02-18T11:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.160851 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.160879 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.160888 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.160900 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.160908 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:48Z","lastTransitionTime":"2026-02-18T11:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.170036 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 10:26:08.638299906 +0000 UTC Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.179512 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.179512 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:48 crc kubenswrapper[4880]: E0218 11:52:48.179677 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:48 crc kubenswrapper[4880]: E0218 11:52:48.179718 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.179535 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:48 crc kubenswrapper[4880]: E0218 11:52:48.179790 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.263185 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.263215 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.263226 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.263240 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.263249 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:48Z","lastTransitionTime":"2026-02-18T11:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.365533 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.365855 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.366005 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.366095 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.366176 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:48Z","lastTransitionTime":"2026-02-18T11:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.468551 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.468588 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.468637 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.468654 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.468663 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:48Z","lastTransitionTime":"2026-02-18T11:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.571327 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.571632 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.571703 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.571807 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.571877 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:48Z","lastTransitionTime":"2026-02-18T11:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.588597 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/3.log" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.591670 4880 scope.go:117] "RemoveContainer" containerID="fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881" Feb 18 11:52:48 crc kubenswrapper[4880]: E0218 11:52:48.591801 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.604387 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c79fb0e-fda9-432a-aa44-7d013393deee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f59e6228c0008882a58e27dda27703242aa49db4db2d03be0cbe9fe055567e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770ed0dcb2943b2a6110fab48f3531d9af62bf5f9e9545bfea3400ebf6b6c564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://770ed0dcb2943b2a6110fab48f3531d9af62bf5f9e9545bfea3400ebf6b6c564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.616890 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.628158 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.642682 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.655190 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.663669 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc 
kubenswrapper[4880]: I0218 11:52:48.673890 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.673920 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.673930 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.673942 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.673951 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:48Z","lastTransitionTime":"2026-02-18T11:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.697381 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 
certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee
1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.723684 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.737367 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.750481 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:41Z\\\",\\\"message\\\":\\\"2026-02-18T11:51:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9\\\\n2026-02-18T11:51:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9 to /host/opt/cni/bin/\\\\n2026-02-18T11:51:56Z [verbose] multus-daemon started\\\\n2026-02-18T11:51:56Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:52:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.763014 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e5
8c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.775933 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.775970 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.775979 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:48 crc 
kubenswrapper[4880]: I0218 11:52:48.775996 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.776004 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:48Z","lastTransitionTime":"2026-02-18T11:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.777731 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea
20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.798022 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.809758 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.820622 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.829420 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.844123 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d8ff70e-3dd8-48c2-9348-b963de20a0da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237f827f8034d23febe06cc158b9ba7686a266e2d52daae9023cc7b39f89730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78e2154ad25bd018654841b56e08f81754a97d473e8f089667f243fbb4dab3c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9626f8acf726acc784c8338eed3d7124dc7a5324664ccbe3dca90b1ab06e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.855512 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.873669 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:47Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.130149 6912 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.130265 6912 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.130671 
6912 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.131341 6912 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 11:52:47.131366 6912 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:52:47.131384 6912 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0218 11:52:47.131399 6912 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:52:47.131401 6912 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:52:47.131406 6912 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:52:47.131418 6912 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:52:47.131427 6912 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:52:47.131432 6912 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 11:52:47.131447 6912 factory.go:656] Stopping watch factory\\\\nI0218 11:52:47.131465 6912 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b
7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:48Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.878072 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.878094 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.878106 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.878120 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.878130 4880 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:48Z","lastTransitionTime":"2026-02-18T11:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.981167 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.981204 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.981216 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.981232 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:48 crc kubenswrapper[4880]: I0218 11:52:48.981242 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:48Z","lastTransitionTime":"2026-02-18T11:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.083029 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.083076 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.083089 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.083105 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.083118 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:49Z","lastTransitionTime":"2026-02-18T11:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.171165 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 07:27:34.267616074 +0000 UTC Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.178809 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:49 crc kubenswrapper[4880]: E0218 11:52:49.179008 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.185328 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.186022 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.186059 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.186076 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.186090 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:49Z","lastTransitionTime":"2026-02-18T11:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.288656 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.289372 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.289544 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.289782 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.289825 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:49Z","lastTransitionTime":"2026-02-18T11:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.393061 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.393532 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.393778 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.393947 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.394113 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:49Z","lastTransitionTime":"2026-02-18T11:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.497816 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.497865 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.497877 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.497896 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.497908 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:49Z","lastTransitionTime":"2026-02-18T11:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.600948 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.601001 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.601014 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.601033 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.601049 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:49Z","lastTransitionTime":"2026-02-18T11:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.703328 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.703371 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.703382 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.703397 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.703408 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:49Z","lastTransitionTime":"2026-02-18T11:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.805929 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.805987 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.806003 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.806027 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.806044 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:49Z","lastTransitionTime":"2026-02-18T11:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.908904 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.908989 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.909018 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.909049 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:49 crc kubenswrapper[4880]: I0218 11:52:49.909074 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:49Z","lastTransitionTime":"2026-02-18T11:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.011291 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.011330 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.011341 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.011356 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.011367 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.094491 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.094542 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.094553 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.094584 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.094596 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: E0218 11:52:50.114361 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:50Z is after 2025-08-24T17:21:41Z"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.121091 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.121136 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.121149 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.121170 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.121183 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: E0218 11:52:50.140804 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:50Z is after 2025-08-24T17:21:41Z"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.145250 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.145328 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.145358 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.145391 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.145416 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: E0218 11:52:50.158925 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:50Z is after 2025-08-24T17:21:41Z"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.163881 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.163928 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.163940 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.163960 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.163973 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.171715 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 05:48:58.939393443 +0000 UTC
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.179271 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.179327 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.179326 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:52:50 crc kubenswrapper[4880]: E0218 11:52:50.179495 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 11:52:50 crc kubenswrapper[4880]: E0218 11:52:50.179654 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 11:52:50 crc kubenswrapper[4880]: E0218 11:52:50.179805 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:50 crc kubenswrapper[4880]: E0218 11:52:50.183498 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:50Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.187783 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.187822 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.187833 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.187849 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.187864 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: E0218 11:52:50.201715 4880 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee442dc6-a62b-4ceb-ac1d-9c063b3e0551\\\",\\\"systemUUID\\\":\\\"f0934b39-096d-4ca6-8108-e77503086d3c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:50Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:50 crc kubenswrapper[4880]: E0218 11:52:50.201838 4880 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.203513 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.203554 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.203587 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.203669 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.203681 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.307453 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.307556 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.307570 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.307588 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.307599 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.411207 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.411291 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.411318 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.411351 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.411373 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.515583 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.515704 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.515734 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.515765 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.515784 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.620904 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.620984 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.620999 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.621023 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.621038 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.726010 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.726048 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.726057 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.726072 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.726082 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.828698 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.828764 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.828782 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.828807 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.828831 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.931084 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.931153 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.931166 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.931184 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:50 crc kubenswrapper[4880]: I0218 11:52:50.931195 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:50Z","lastTransitionTime":"2026-02-18T11:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.034074 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.034119 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.034131 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.034150 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.034177 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:51Z","lastTransitionTime":"2026-02-18T11:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.136439 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.136472 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.136482 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.136497 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.136507 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:51Z","lastTransitionTime":"2026-02-18T11:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.171887 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 15:52:11.458750646 +0000 UTC Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.179372 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:51 crc kubenswrapper[4880]: E0218 11:52:51.179556 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.239305 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.239367 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.239384 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.239405 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.239423 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:51Z","lastTransitionTime":"2026-02-18T11:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.341892 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.341953 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.341968 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.341987 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.342003 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:51Z","lastTransitionTime":"2026-02-18T11:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.445282 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.445314 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.445323 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.445335 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.445344 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:51Z","lastTransitionTime":"2026-02-18T11:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.551292 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.551380 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.551399 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.551423 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.551440 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:51Z","lastTransitionTime":"2026-02-18T11:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.654721 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.655075 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.655175 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.655246 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.655313 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:51Z","lastTransitionTime":"2026-02-18T11:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.758866 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.759173 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.759259 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.759343 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.759415 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:51Z","lastTransitionTime":"2026-02-18T11:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.862270 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.862383 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.862396 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.862450 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.862461 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:51Z","lastTransitionTime":"2026-02-18T11:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.965566 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.965686 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.965708 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.965738 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:51 crc kubenswrapper[4880]: I0218 11:52:51.965774 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:51Z","lastTransitionTime":"2026-02-18T11:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.069496 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.069548 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.069562 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.069585 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.069599 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:52Z","lastTransitionTime":"2026-02-18T11:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.172500 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 01:45:21.893003504 +0000 UTC Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.173909 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.174147 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.174306 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.174469 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.174704 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:52Z","lastTransitionTime":"2026-02-18T11:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.179171 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.179226 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.179237 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:52 crc kubenswrapper[4880]: E0218 11:52:52.179342 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:52 crc kubenswrapper[4880]: E0218 11:52:52.179439 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:52 crc kubenswrapper[4880]: E0218 11:52:52.179525 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.280018 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.280084 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.280103 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.280130 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.280150 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:52Z","lastTransitionTime":"2026-02-18T11:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.384679 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.384749 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.384763 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.384809 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.384824 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:52Z","lastTransitionTime":"2026-02-18T11:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.487969 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.488017 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.488034 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.488222 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.488240 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:52Z","lastTransitionTime":"2026-02-18T11:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.591525 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.591583 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.591599 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.591686 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.591708 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:52Z","lastTransitionTime":"2026-02-18T11:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.693651 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.693692 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.693702 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.693714 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.693723 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:52Z","lastTransitionTime":"2026-02-18T11:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.796635 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.796976 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.797061 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.797165 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.797245 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:52Z","lastTransitionTime":"2026-02-18T11:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.900106 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.900156 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.900167 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.900185 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:52 crc kubenswrapper[4880]: I0218 11:52:52.900195 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:52Z","lastTransitionTime":"2026-02-18T11:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.002758 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.002795 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.002804 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.002822 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.002831 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:53Z","lastTransitionTime":"2026-02-18T11:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.106161 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.106253 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.106268 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.106315 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.106329 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:53Z","lastTransitionTime":"2026-02-18T11:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.173245 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 19:29:46.285114404 +0000 UTC Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.178842 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:53 crc kubenswrapper[4880]: E0218 11:52:53.178989 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.194306 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec9545747663880bf83a602f1fc4e7b8f97279573e433fd594ffca264cbb1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.210099 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.210205 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.210223 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.210276 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.210295 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:53Z","lastTransitionTime":"2026-02-18T11:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.213583 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://759c147434938329e8d29625f1163a51ceb821264e0a4ba9dc258a11383dffe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://336aecfd52f090dcdb98cb9f41e95d07b462f37cec894e4b77a82099888a13fb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.228638 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.246382 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe51086c-95bb-47e7-a56d-4bb9437c7b1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b47b8ebe9334d2b6fda43ed59aa2d6f09c3a06376bacde5fd243b96b2fcc36d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f1595328b9ec64bf85eaefd9036d238ede2
d122e7999779bc27966d39a8c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98sjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q4njv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.259267 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6981df1-6d75-41e2-a41e-ac960f0a847a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drpqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:52:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nj7dq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc 
kubenswrapper[4880]: I0218 11:52:53.274531 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c79fb0e-fda9-432a-aa44-7d013393deee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f59e6228c0008882a58e27dda27703242aa49db4db2d03be0cbe9fe055567e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://770ed0dcb2943b2a6110fab48f3531d9af62bf5f9e9545bfea3400ebf6b6c564\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://770ed0dcb2943b2a6110fab48f3531d9af62bf5f9e9545bfea3400ebf6b6c564\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.290397 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d9cf8f3-3612-4841-8023-0a600364055c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98bc74a727e04a243bf7997b33063d83498ecf69dfdfd392a893074d539f1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e2c924d27f3e8bd11dce4e0ba798de45291bbe49a8cbd20eb59dec2f5c50a93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5352e71219622e0279b5e1347ac9bfe718178f05c3d7eba037d75bf4bc865a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.304472 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61599471af03e70b78f0b3f4c589952472b3d915833faf8344da03b222181ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.313548 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.313585 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.313596 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.313630 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.313643 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:53Z","lastTransitionTime":"2026-02-18T11:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.319576 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mh8wn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:41Z\\\",\\\"message\\\":\\\"2026-02-18T11:51:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9\\\\n2026-02-18T11:51:55+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f52d4378-2cb6-4166-9f62-b9f5733964f9 to /host/opt/cni/bin/\\\\n2026-02-18T11:51:56Z [verbose] multus-daemon started\\\\n2026-02-18T11:51:56Z [verbose] Readiness Indicator file check\\\\n2026-02-18T11:52:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mh8wn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.332657 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf5fee6a-c0f1-43c5-8991-cc078ccb904d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cbdac453d0898b8c107d568d319eaf00e311db8404d5db9093fb9858f4a8c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e5
8c50ecc38fe961f31b6efc82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bctkq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c8jsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.352094 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26dv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d362ee-8a6b-4eaa-8ccd-5a0ab29a0157\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da2137ab548f5ccfde5c03c9d14e4c50ef3785ab1e66b23eb4eb38db78c479e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc8ecd5f9f07514c2f916b4bedc43071497fa58b442587e28638d9c6b999599\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2790f1ca00f50a361d78d4997ff8c7c8e4f78fc04b7bcec32ecd99890d8b8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef96d48ce6673d45e1a5a7b2137fa64441bb9004026bd11c7f2e4787551273c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dc6a
d63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc6ad63330511b96695e85fc8a2b00c1f8ec7d9af6526bb585555ebe8e07132\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a31371dbdfa0feef933fb341d17a43eb6502eed295fe2c3346a5e669a5cf7025\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea20063b268e813456808ffc61c9e5e57568c164ab15311804689781ec34c367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jqn9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26dv6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.374773 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7baa9e97-9a09-417f-ba64-82e5e5f4276d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0218 11:51:52.037519 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0218 11:51:52.040384 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040493 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0218 11:51:52.040500 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1743970407/tls.crt::/tmp/serving-cert-1743970407/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771415496\\\\\\\\\\\\\\\" (2026-02-18 11:51:36 +0000 UTC to 2026-03-20 11:51:37 +0000 UTC (now=2026-02-18 11:51:52.040422348 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040517 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040551 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0218 11:51:52.040573 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0218 
11:51:52.040578 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0218 11:51:52.040665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771415497\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771415497\\\\\\\\\\\\\\\" (2026-02-18 10:51:37 +0000 UTC to 2027-02-18 10:51:37 +0000 UTC (now=2026-02-18 11:51:52.040635433 +0000 UTC))\\\\\\\"\\\\nI0218 11:51:52.040690 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nF0218 11:51:52.040713 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.388853 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.403147 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.414507 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6zfpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6bd757b-e4ff-456b-8531-a8cdf2095d1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748aa199901670e7edcedf891a349f16e37d9460b47f6a848ba58f071260fd67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dvl9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6zfpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.416237 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.416273 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.416283 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.416301 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.416311 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:53Z","lastTransitionTime":"2026-02-18T11:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.435649 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49768c5d-83e5-4dea-8f86-b57637fc5774\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da84c66c61d8a6facaae08da45179b123f4f7e56cae7d411079c76127a6b894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://469dd0209b77517119bc6e558cedfbcbd4312f20e9f87f880557c7568338b413\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7459edb0eb4b8cbf8ad07509bda78087591c903c8f950f47a0b24c2b8b5351cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15495ff9c0aac692c2eed8488e76752db47c25d8b0891a794dd27a86ac4a0985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7926c19fba027b0182ab1b7ce91e9b2dfe30e03e9f46647911d382a84a9e683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3548f2694ace290c3f61611a9eb8932fa5fbed8fd6b7c08b722b59fe9797c36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e810070b18e2379ac3b35f019b24b998f5d3119aa546b413e84b7ce8a8248de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ed1fd98bb9ab5761470742cbbe69aca876ac9c1469fd4740d6399104ce0527\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.447879 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfpgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c6371a-6404-4edc-9073-8b61c332a6f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de8e650c187841f14269e1a9c91702b3fd18ef9eafcc4b9f8f33a50226092531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqwzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfpgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.474394 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ff5dd18-cbe8-4a79-9518-9786a3521131\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-18T11:52:47Z\\\",\\\"message\\\":\\\"s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.130149 6912 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.130265 6912 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.130671 
6912 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0218 11:52:47.131341 6912 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0218 11:52:47.131366 6912 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0218 11:52:47.131384 6912 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0218 11:52:47.131399 6912 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0218 11:52:47.131401 6912 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0218 11:52:47.131406 6912 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0218 11:52:47.131418 6912 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0218 11:52:47.131427 6912 handler.go:208] Removed *v1.Node event handler 2\\\\nI0218 11:52:47.131432 6912 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0218 11:52:47.131447 6912 factory.go:656] Stopping watch factory\\\\nI0218 11:52:47.131465 6912 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T11:52:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f99c4f46556cc7e1b
7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzfkz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6jxd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.488092 4880 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d8ff70e-3dd8-48c2-9348-b963de20a0da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T11:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237f827f8034d23febe06cc158b9ba7686a266e2d52daae9023cc7b39f89730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78e2154ad25bd018654841b56e08f81754a97d473e8f089667f243fbb4dab3c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d9626f8acf726acc784c8338eed3d7124dc7a5324664ccbe3dca90b1ab06e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T11:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b8194228e990c2b099094408ead314a3cf0c52aa53f195ddc05b5af7bbf544c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T11:51:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T11:51:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T11:51:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T11:52:53Z is after 2025-08-24T17:21:41Z" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.518668 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.518736 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.518753 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.518776 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.518789 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:53Z","lastTransitionTime":"2026-02-18T11:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.621658 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.621701 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.621713 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.621733 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.621751 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:53Z","lastTransitionTime":"2026-02-18T11:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.725895 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.725945 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.725959 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.725977 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.725992 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:53Z","lastTransitionTime":"2026-02-18T11:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.829860 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.829916 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.829928 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.829952 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.829968 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:53Z","lastTransitionTime":"2026-02-18T11:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.934858 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.934953 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.934992 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.935043 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:53 crc kubenswrapper[4880]: I0218 11:52:53.935073 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:53Z","lastTransitionTime":"2026-02-18T11:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.038721 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.038912 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.038930 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.038953 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.038965 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:54Z","lastTransitionTime":"2026-02-18T11:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.142893 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.142969 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.142985 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.143017 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.143035 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:54Z","lastTransitionTime":"2026-02-18T11:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.173793 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 05:22:59.180690939 +0000 UTC
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.179190 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.179287 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.179363 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:52:54 crc kubenswrapper[4880]: E0218 11:52:54.179405 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 11:52:54 crc kubenswrapper[4880]: E0218 11:52:54.179556 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 11:52:54 crc kubenswrapper[4880]: E0218 11:52:54.179876 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.246215 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.246291 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.246307 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.246339 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.246355 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:54Z","lastTransitionTime":"2026-02-18T11:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.350242 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.350352 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.350374 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.350438 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.350461 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:54Z","lastTransitionTime":"2026-02-18T11:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.453588 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.453686 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.453706 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.453733 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.453753 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:54Z","lastTransitionTime":"2026-02-18T11:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.557560 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.557695 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.557723 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.557759 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.557782 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:54Z","lastTransitionTime":"2026-02-18T11:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.661568 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.661656 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.661666 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.661680 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.661691 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:54Z","lastTransitionTime":"2026-02-18T11:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.765529 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.765577 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.765588 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.765625 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.765637 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:54Z","lastTransitionTime":"2026-02-18T11:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.869228 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.869279 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.869290 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.869309 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.869350 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:54Z","lastTransitionTime":"2026-02-18T11:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.973167 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.973211 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.973225 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.973246 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:54 crc kubenswrapper[4880]: I0218 11:52:54.973258 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:54Z","lastTransitionTime":"2026-02-18T11:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.082451 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.082512 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.082528 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.082551 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.082568 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:55Z","lastTransitionTime":"2026-02-18T11:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.174480 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 02:33:29.907540368 +0000 UTC
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.179081 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq"
Feb 18 11:52:55 crc kubenswrapper[4880]: E0218 11:52:55.179369 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.185643 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.185713 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.185731 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.185760 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.185778 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:55Z","lastTransitionTime":"2026-02-18T11:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.288691 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.288745 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.288757 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.288775 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.288787 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:55Z","lastTransitionTime":"2026-02-18T11:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.391649 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.391706 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.391723 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.391745 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.391762 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:55Z","lastTransitionTime":"2026-02-18T11:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.494452 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.494506 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.494523 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.494541 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.494554 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:55Z","lastTransitionTime":"2026-02-18T11:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.597903 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.597950 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.597961 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.597977 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.597987 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:55Z","lastTransitionTime":"2026-02-18T11:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.701179 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.701223 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.701234 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.701251 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.701260 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:55Z","lastTransitionTime":"2026-02-18T11:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.803756 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.803826 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.803848 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.803882 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.803923 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:55Z","lastTransitionTime":"2026-02-18T11:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.907372 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.907463 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.907487 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.907515 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.907535 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:55Z","lastTransitionTime":"2026-02-18T11:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.967233 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.967372 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:52:55 crc kubenswrapper[4880]: E0218 11:52:55.967404 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.967374437 +0000 UTC m=+147.396275298 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:52:55 crc kubenswrapper[4880]: I0218 11:52:55.967514 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 11:52:55 crc kubenswrapper[4880]: E0218 11:52:55.967715 4880 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 11:52:55 crc kubenswrapper[4880]: E0218 11:52:55.967741 4880 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:52:55 crc kubenswrapper[4880]: E0218 11:52:55.967808 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.9677887 +0000 UTC m=+147.396689561 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 11:52:55 crc kubenswrapper[4880]: E0218 11:52:55.967930 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.967883142 +0000 UTC m=+147.396784203 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.010657 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.010730 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.010744 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.010761 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.010772 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:56Z","lastTransitionTime":"2026-02-18T11:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.069201 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.069771 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.069467 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.070073 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.070092 4880 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.070163 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.070142601 +0000 UTC m=+147.499043682 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.069879 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.070324 4880 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.070333 4880 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.070357 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5
nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.070350297 +0000 UTC m=+147.499251158 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.114137 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.114200 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.114214 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.114234 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.114246 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:56Z","lastTransitionTime":"2026-02-18T11:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.175490 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 15:58:36.591482717 +0000 UTC Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.178869 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.178946 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.178995 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.178869 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.179093 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:56 crc kubenswrapper[4880]: E0218 11:52:56.179165 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.217021 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.217056 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.217066 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.217080 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.217089 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:56Z","lastTransitionTime":"2026-02-18T11:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.320585 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.320693 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.320706 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.320721 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.320731 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:56Z","lastTransitionTime":"2026-02-18T11:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.423914 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.424507 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.424538 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.424570 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.424640 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:56Z","lastTransitionTime":"2026-02-18T11:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.528111 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.528176 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.528191 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.528218 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.528233 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:56Z","lastTransitionTime":"2026-02-18T11:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.631951 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.632024 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.632042 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.632066 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.632082 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:56Z","lastTransitionTime":"2026-02-18T11:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.735097 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.735142 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.735155 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.735175 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.735188 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:56Z","lastTransitionTime":"2026-02-18T11:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.837953 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.837992 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.838001 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.838015 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.838024 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:56Z","lastTransitionTime":"2026-02-18T11:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.941012 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.941051 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.941062 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.941078 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:56 crc kubenswrapper[4880]: I0218 11:52:56.941090 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:56Z","lastTransitionTime":"2026-02-18T11:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.043066 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.043101 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.043127 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.043143 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.043154 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:57Z","lastTransitionTime":"2026-02-18T11:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.147143 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.147193 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.147215 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.147234 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.147247 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:57Z","lastTransitionTime":"2026-02-18T11:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.176150 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 11:23:27.901055988 +0000 UTC Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.179551 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:57 crc kubenswrapper[4880]: E0218 11:52:57.179708 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.250691 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.250764 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.250783 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.250813 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.250833 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:57Z","lastTransitionTime":"2026-02-18T11:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.354205 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.354241 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.354249 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.354261 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.354271 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:57Z","lastTransitionTime":"2026-02-18T11:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.459091 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.459164 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.459179 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.459207 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.459222 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:57Z","lastTransitionTime":"2026-02-18T11:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.562040 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.562101 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.562125 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.562150 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.562168 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:57Z","lastTransitionTime":"2026-02-18T11:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.665298 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.665377 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.665399 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.665433 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.665455 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:57Z","lastTransitionTime":"2026-02-18T11:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.768009 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.768055 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.768069 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.768087 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.768099 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:57Z","lastTransitionTime":"2026-02-18T11:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.870808 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.871129 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.871270 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.871393 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.871526 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:57Z","lastTransitionTime":"2026-02-18T11:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.975093 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.975135 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.975146 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.975165 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:57 crc kubenswrapper[4880]: I0218 11:52:57.975176 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:57Z","lastTransitionTime":"2026-02-18T11:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.079937 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.079995 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.080009 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.080026 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.080036 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:58Z","lastTransitionTime":"2026-02-18T11:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.177321 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 14:19:04.992827749 +0000 UTC Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.179660 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.179684 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.179667 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:52:58 crc kubenswrapper[4880]: E0218 11:52:58.179811 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:52:58 crc kubenswrapper[4880]: E0218 11:52:58.179886 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:52:58 crc kubenswrapper[4880]: E0218 11:52:58.180021 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.182176 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.182208 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.182218 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.182234 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.182247 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:58Z","lastTransitionTime":"2026-02-18T11:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.285859 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.285900 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.285909 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.285923 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.285933 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:58Z","lastTransitionTime":"2026-02-18T11:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.388160 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.388198 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.388207 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.388220 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.388229 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:58Z","lastTransitionTime":"2026-02-18T11:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.491450 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.491529 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.491554 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.491645 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.491666 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:58Z","lastTransitionTime":"2026-02-18T11:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.595237 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.595353 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.595382 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.595411 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.595432 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:58Z","lastTransitionTime":"2026-02-18T11:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.697960 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.697996 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.698009 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.698060 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.698074 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:58Z","lastTransitionTime":"2026-02-18T11:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.801090 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.801182 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.801202 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.801225 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.801316 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:58Z","lastTransitionTime":"2026-02-18T11:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.904550 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.904637 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.904655 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.904683 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:58 crc kubenswrapper[4880]: I0218 11:52:58.904698 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:58Z","lastTransitionTime":"2026-02-18T11:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.007253 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.007289 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.007301 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.007320 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.007331 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:59Z","lastTransitionTime":"2026-02-18T11:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.109021 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.109088 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.109108 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.109135 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.109153 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:59Z","lastTransitionTime":"2026-02-18T11:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.178270 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 03:16:44.434188786 +0000 UTC Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.179695 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:52:59 crc kubenswrapper[4880]: E0218 11:52:59.180208 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.180425 4880 scope.go:117] "RemoveContainer" containerID="fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881" Feb 18 11:52:59 crc kubenswrapper[4880]: E0218 11:52:59.180596 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.211854 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.212278 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.212401 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.212497 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.212576 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:59Z","lastTransitionTime":"2026-02-18T11:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.315127 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.315254 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.315290 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.315325 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.315351 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:59Z","lastTransitionTime":"2026-02-18T11:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.417911 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.417957 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.417969 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.417985 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.417996 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:59Z","lastTransitionTime":"2026-02-18T11:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.521057 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.521405 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.521521 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.521640 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.521741 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:59Z","lastTransitionTime":"2026-02-18T11:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.624518 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.624563 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.624575 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.624593 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.624629 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:59Z","lastTransitionTime":"2026-02-18T11:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.727355 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.727639 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.727751 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.727840 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.727920 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:59Z","lastTransitionTime":"2026-02-18T11:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.830422 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.830468 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.830479 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.830496 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.830510 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:59Z","lastTransitionTime":"2026-02-18T11:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.932844 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.932894 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.932904 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.932919 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:52:59 crc kubenswrapper[4880]: I0218 11:52:59.932929 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:52:59Z","lastTransitionTime":"2026-02-18T11:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.035245 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.035276 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.035286 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.035300 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.035309 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:53:00Z","lastTransitionTime":"2026-02-18T11:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.137260 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.137298 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.137308 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.137323 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.137333 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:53:00Z","lastTransitionTime":"2026-02-18T11:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.178719 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 21:32:29.741987671 +0000 UTC Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.178873 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.178874 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:00 crc kubenswrapper[4880]: E0218 11:53:00.178991 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:00 crc kubenswrapper[4880]: E0218 11:53:00.179043 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.178873 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:00 crc kubenswrapper[4880]: E0218 11:53:00.179105 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.239934 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.239984 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.240012 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.240030 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.240074 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:53:00Z","lastTransitionTime":"2026-02-18T11:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.342825 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.342868 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.342879 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.342894 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.342903 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:53:00Z","lastTransitionTime":"2026-02-18T11:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.396928 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.396982 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.396994 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.397013 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.397025 4880 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T11:53:00Z","lastTransitionTime":"2026-02-18T11:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.442170 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf"] Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.442846 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.445221 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.445414 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.445758 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.447002 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.470888 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-26dv6" podStartSLOduration=67.47086675 podStartE2EDuration="1m7.47086675s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.470776268 +0000 UTC m=+87.899677139" watchObservedRunningTime="2026-02-18 11:53:00.47086675 +0000 UTC m=+87.899767611" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.512202 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.512264 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.512313 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.512340 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.512431 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.513999 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.513983197 podStartE2EDuration="1m8.513983197s" podCreationTimestamp="2026-02-18 
11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.501370021 +0000 UTC m=+87.930270912" watchObservedRunningTime="2026-02-18 11:53:00.513983197 +0000 UTC m=+87.942884058" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.514291 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.514286617 podStartE2EDuration="1m7.514286617s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.514152593 +0000 UTC m=+87.943053444" watchObservedRunningTime="2026-02-18 11:53:00.514286617 +0000 UTC m=+87.943187478" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.542868 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mh8wn" podStartSLOduration=68.542845093 podStartE2EDuration="1m8.542845093s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.542072562 +0000 UTC m=+87.970973453" watchObservedRunningTime="2026-02-18 11:53:00.542845093 +0000 UTC m=+87.971745954" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.557019 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podStartSLOduration=68.557002003 podStartE2EDuration="1m8.557002003s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.556971932 +0000 UTC m=+87.985872793" watchObservedRunningTime="2026-02-18 
11:53:00.557002003 +0000 UTC m=+87.985902864" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.579493 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.579473438 podStartE2EDuration="1m8.579473438s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.579108778 +0000 UTC m=+88.008009639" watchObservedRunningTime="2026-02-18 11:53:00.579473438 +0000 UTC m=+88.008374299" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.613675 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.613733 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.613760 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.613800 4880 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.613828 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.613902 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.614069 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.614751 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.615768 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6zfpm" podStartSLOduration=68.615756023 podStartE2EDuration="1m8.615756023s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.61423335 +0000 UTC m=+88.043134211" watchObservedRunningTime="2026-02-18 11:53:00.615756023 +0000 UTC m=+88.044656874" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.620126 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.627911 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=33.627894566 podStartE2EDuration="33.627894566s" podCreationTimestamp="2026-02-18 11:52:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.62735424 +0000 UTC m=+88.056255101" watchObservedRunningTime="2026-02-18 11:53:00.627894566 +0000 UTC m=+88.056795427" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.632943 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62ec5f1c-9031-4c18-91d6-4d29ee0038b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qw4zf\" (UID: \"62ec5f1c-9031-4c18-91d6-4d29ee0038b3\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.639124 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bfpgn" podStartSLOduration=68.639106513 podStartE2EDuration="1m8.639106513s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.638475754 +0000 UTC m=+88.067376615" watchObservedRunningTime="2026-02-18 11:53:00.639106513 +0000 UTC m=+88.068007374" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.683973 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=15.68395726 podStartE2EDuration="15.68395726s" podCreationTimestamp="2026-02-18 11:52:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.683186338 +0000 UTC m=+88.112087199" watchObservedRunningTime="2026-02-18 11:53:00.68395726 +0000 UTC m=+88.112858111" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.730475 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q4njv" podStartSLOduration=67.730458173 podStartE2EDuration="1m7.730458173s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:00.730048142 +0000 UTC m=+88.158949003" watchObservedRunningTime="2026-02-18 11:53:00.730458173 +0000 UTC m=+88.159359034" Feb 18 11:53:00 crc kubenswrapper[4880]: I0218 11:53:00.757751 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" Feb 18 11:53:01 crc kubenswrapper[4880]: I0218 11:53:01.179679 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 16:55:05.47662433 +0000 UTC Feb 18 11:53:01 crc kubenswrapper[4880]: I0218 11:53:01.179971 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 18 11:53:01 crc kubenswrapper[4880]: I0218 11:53:01.179814 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:01 crc kubenswrapper[4880]: E0218 11:53:01.180201 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:01 crc kubenswrapper[4880]: I0218 11:53:01.189004 4880 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 18 11:53:01 crc kubenswrapper[4880]: I0218 11:53:01.636715 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" event={"ID":"62ec5f1c-9031-4c18-91d6-4d29ee0038b3","Type":"ContainerStarted","Data":"085f2ee31a88cd1cb2a1d88c21c49d2c1b869475aef69eaf36ce20efd86fe6db"} Feb 18 11:53:01 crc kubenswrapper[4880]: I0218 11:53:01.636763 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" event={"ID":"62ec5f1c-9031-4c18-91d6-4d29ee0038b3","Type":"ContainerStarted","Data":"4e8e269c5b62b3def1b4dfe5f93f737db2e87d1c8f658668dedaceafba96fc32"} Feb 18 11:53:01 crc kubenswrapper[4880]: I0218 11:53:01.649128 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qw4zf" podStartSLOduration=69.649109525 podStartE2EDuration="1m9.649109525s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:01.648753165 +0000 UTC m=+89.077654026" watchObservedRunningTime="2026-02-18 11:53:01.649109525 +0000 UTC m=+89.078010386" Feb 18 11:53:02 crc kubenswrapper[4880]: I0218 11:53:02.178856 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:02 crc kubenswrapper[4880]: I0218 11:53:02.178893 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:02 crc kubenswrapper[4880]: I0218 11:53:02.178857 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:02 crc kubenswrapper[4880]: E0218 11:53:02.179000 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:02 crc kubenswrapper[4880]: E0218 11:53:02.179117 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:02 crc kubenswrapper[4880]: E0218 11:53:02.179252 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:03 crc kubenswrapper[4880]: I0218 11:53:03.179134 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:03 crc kubenswrapper[4880]: E0218 11:53:03.180025 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:04 crc kubenswrapper[4880]: I0218 11:53:04.178854 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:04 crc kubenswrapper[4880]: I0218 11:53:04.178910 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:04 crc kubenswrapper[4880]: E0218 11:53:04.178970 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:04 crc kubenswrapper[4880]: I0218 11:53:04.178980 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:04 crc kubenswrapper[4880]: E0218 11:53:04.179068 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:04 crc kubenswrapper[4880]: E0218 11:53:04.179152 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:05 crc kubenswrapper[4880]: I0218 11:53:05.178975 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:05 crc kubenswrapper[4880]: E0218 11:53:05.179196 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:06 crc kubenswrapper[4880]: I0218 11:53:06.179559 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:06 crc kubenswrapper[4880]: I0218 11:53:06.179596 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:06 crc kubenswrapper[4880]: I0218 11:53:06.179623 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:06 crc kubenswrapper[4880]: E0218 11:53:06.179822 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:06 crc kubenswrapper[4880]: E0218 11:53:06.179990 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:06 crc kubenswrapper[4880]: E0218 11:53:06.180102 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:07 crc kubenswrapper[4880]: I0218 11:53:07.179197 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:07 crc kubenswrapper[4880]: E0218 11:53:07.179321 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:08 crc kubenswrapper[4880]: I0218 11:53:08.179239 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:08 crc kubenswrapper[4880]: I0218 11:53:08.179303 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:08 crc kubenswrapper[4880]: E0218 11:53:08.179399 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:08 crc kubenswrapper[4880]: I0218 11:53:08.179239 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:08 crc kubenswrapper[4880]: E0218 11:53:08.179498 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:08 crc kubenswrapper[4880]: E0218 11:53:08.179563 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:09 crc kubenswrapper[4880]: I0218 11:53:09.180079 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:09 crc kubenswrapper[4880]: E0218 11:53:09.180855 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:10 crc kubenswrapper[4880]: I0218 11:53:10.179160 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:10 crc kubenswrapper[4880]: I0218 11:53:10.179197 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:10 crc kubenswrapper[4880]: E0218 11:53:10.179268 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:10 crc kubenswrapper[4880]: I0218 11:53:10.179288 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:10 crc kubenswrapper[4880]: E0218 11:53:10.179390 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:10 crc kubenswrapper[4880]: E0218 11:53:10.179455 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:11 crc kubenswrapper[4880]: I0218 11:53:11.178723 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:11 crc kubenswrapper[4880]: E0218 11:53:11.179294 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:11 crc kubenswrapper[4880]: I0218 11:53:11.227538 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:11 crc kubenswrapper[4880]: E0218 11:53:11.227855 4880 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:53:11 crc kubenswrapper[4880]: E0218 11:53:11.227914 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs podName:c6981df1-6d75-41e2-a41e-ac960f0a847a nodeName:}" failed. No retries permitted until 2026-02-18 11:54:15.227901153 +0000 UTC m=+162.656802014 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs") pod "network-metrics-daemon-nj7dq" (UID: "c6981df1-6d75-41e2-a41e-ac960f0a847a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 11:53:12 crc kubenswrapper[4880]: I0218 11:53:12.179095 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:12 crc kubenswrapper[4880]: E0218 11:53:12.179260 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:12 crc kubenswrapper[4880]: I0218 11:53:12.179272 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:12 crc kubenswrapper[4880]: I0218 11:53:12.179299 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:12 crc kubenswrapper[4880]: E0218 11:53:12.179396 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:12 crc kubenswrapper[4880]: E0218 11:53:12.179485 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:13 crc kubenswrapper[4880]: I0218 11:53:13.179663 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:13 crc kubenswrapper[4880]: E0218 11:53:13.180044 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:13 crc kubenswrapper[4880]: I0218 11:53:13.180240 4880 scope.go:117] "RemoveContainer" containerID="fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881" Feb 18 11:53:13 crc kubenswrapper[4880]: E0218 11:53:13.180458 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" Feb 18 11:53:14 crc kubenswrapper[4880]: I0218 11:53:14.179082 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:14 crc kubenswrapper[4880]: I0218 11:53:14.179108 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:14 crc kubenswrapper[4880]: I0218 11:53:14.179157 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:14 crc kubenswrapper[4880]: E0218 11:53:14.179207 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:14 crc kubenswrapper[4880]: E0218 11:53:14.179310 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:14 crc kubenswrapper[4880]: E0218 11:53:14.179388 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:15 crc kubenswrapper[4880]: I0218 11:53:15.179738 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:15 crc kubenswrapper[4880]: E0218 11:53:15.179900 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:16 crc kubenswrapper[4880]: I0218 11:53:16.179069 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:16 crc kubenswrapper[4880]: I0218 11:53:16.179120 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:16 crc kubenswrapper[4880]: E0218 11:53:16.179267 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:16 crc kubenswrapper[4880]: I0218 11:53:16.179296 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:16 crc kubenswrapper[4880]: E0218 11:53:16.179407 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:16 crc kubenswrapper[4880]: E0218 11:53:16.179472 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:17 crc kubenswrapper[4880]: I0218 11:53:17.179730 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:17 crc kubenswrapper[4880]: E0218 11:53:17.179850 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:18 crc kubenswrapper[4880]: I0218 11:53:18.178858 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:18 crc kubenswrapper[4880]: I0218 11:53:18.178871 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:18 crc kubenswrapper[4880]: E0218 11:53:18.179098 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:18 crc kubenswrapper[4880]: E0218 11:53:18.178984 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:18 crc kubenswrapper[4880]: I0218 11:53:18.178871 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:18 crc kubenswrapper[4880]: E0218 11:53:18.179166 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:19 crc kubenswrapper[4880]: I0218 11:53:19.179093 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:19 crc kubenswrapper[4880]: E0218 11:53:19.179409 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:20 crc kubenswrapper[4880]: I0218 11:53:20.178746 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:20 crc kubenswrapper[4880]: I0218 11:53:20.178915 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:20 crc kubenswrapper[4880]: E0218 11:53:20.179005 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:20 crc kubenswrapper[4880]: I0218 11:53:20.179024 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:20 crc kubenswrapper[4880]: E0218 11:53:20.179175 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:20 crc kubenswrapper[4880]: E0218 11:53:20.179314 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:21 crc kubenswrapper[4880]: I0218 11:53:21.178982 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:21 crc kubenswrapper[4880]: E0218 11:53:21.179135 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:22 crc kubenswrapper[4880]: I0218 11:53:22.179032 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:22 crc kubenswrapper[4880]: I0218 11:53:22.179089 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:22 crc kubenswrapper[4880]: I0218 11:53:22.179032 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:22 crc kubenswrapper[4880]: E0218 11:53:22.179175 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:22 crc kubenswrapper[4880]: E0218 11:53:22.179241 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:22 crc kubenswrapper[4880]: E0218 11:53:22.179483 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:23 crc kubenswrapper[4880]: I0218 11:53:23.178858 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:23 crc kubenswrapper[4880]: E0218 11:53:23.180096 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:24 crc kubenswrapper[4880]: I0218 11:53:24.179021 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:24 crc kubenswrapper[4880]: I0218 11:53:24.179065 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:24 crc kubenswrapper[4880]: I0218 11:53:24.179031 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:24 crc kubenswrapper[4880]: E0218 11:53:24.179166 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:24 crc kubenswrapper[4880]: E0218 11:53:24.179234 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:24 crc kubenswrapper[4880]: E0218 11:53:24.179285 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:25 crc kubenswrapper[4880]: I0218 11:53:25.178767 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:25 crc kubenswrapper[4880]: E0218 11:53:25.179280 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:25 crc kubenswrapper[4880]: I0218 11:53:25.179651 4880 scope.go:117] "RemoveContainer" containerID="fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881" Feb 18 11:53:25 crc kubenswrapper[4880]: E0218 11:53:25.179832 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-6jxd5_openshift-ovn-kubernetes(0ff5dd18-cbe8-4a79-9518-9786a3521131)\"" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" Feb 18 11:53:26 crc kubenswrapper[4880]: I0218 11:53:26.178961 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:26 crc kubenswrapper[4880]: I0218 11:53:26.179103 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:26 crc kubenswrapper[4880]: I0218 11:53:26.179133 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:26 crc kubenswrapper[4880]: E0218 11:53:26.179253 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:26 crc kubenswrapper[4880]: E0218 11:53:26.179375 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:26 crc kubenswrapper[4880]: E0218 11:53:26.179525 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:27 crc kubenswrapper[4880]: I0218 11:53:27.179330 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:27 crc kubenswrapper[4880]: E0218 11:53:27.179444 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:28 crc kubenswrapper[4880]: I0218 11:53:28.179113 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:28 crc kubenswrapper[4880]: I0218 11:53:28.179295 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:28 crc kubenswrapper[4880]: I0218 11:53:28.179323 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:28 crc kubenswrapper[4880]: E0218 11:53:28.179405 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:28 crc kubenswrapper[4880]: E0218 11:53:28.179689 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:28 crc kubenswrapper[4880]: E0218 11:53:28.179755 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:28 crc kubenswrapper[4880]: I0218 11:53:28.715840 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mh8wn_3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8/kube-multus/1.log" Feb 18 11:53:28 crc kubenswrapper[4880]: I0218 11:53:28.716739 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mh8wn_3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8/kube-multus/0.log" Feb 18 11:53:28 crc kubenswrapper[4880]: I0218 11:53:28.716805 4880 generic.go:334] "Generic (PLEG): container finished" podID="3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8" containerID="6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36" exitCode=1 Feb 18 11:53:28 crc kubenswrapper[4880]: I0218 11:53:28.716847 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mh8wn" 
event={"ID":"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8","Type":"ContainerDied","Data":"6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36"} Feb 18 11:53:28 crc kubenswrapper[4880]: I0218 11:53:28.716896 4880 scope.go:117] "RemoveContainer" containerID="4d10b08569c43e8e3cfb573c63f6997a1db12c63afb0fb51bf417ff7b309315d" Feb 18 11:53:28 crc kubenswrapper[4880]: I0218 11:53:28.717461 4880 scope.go:117] "RemoveContainer" containerID="6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36" Feb 18 11:53:28 crc kubenswrapper[4880]: E0218 11:53:28.717885 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-mh8wn_openshift-multus(3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8)\"" pod="openshift-multus/multus-mh8wn" podUID="3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8" Feb 18 11:53:29 crc kubenswrapper[4880]: I0218 11:53:29.179697 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:29 crc kubenswrapper[4880]: E0218 11:53:29.179889 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:29 crc kubenswrapper[4880]: I0218 11:53:29.720112 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mh8wn_3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8/kube-multus/1.log" Feb 18 11:53:30 crc kubenswrapper[4880]: I0218 11:53:30.179018 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:30 crc kubenswrapper[4880]: I0218 11:53:30.179158 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:30 crc kubenswrapper[4880]: E0218 11:53:30.179459 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:30 crc kubenswrapper[4880]: I0218 11:53:30.179158 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:30 crc kubenswrapper[4880]: E0218 11:53:30.180292 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:30 crc kubenswrapper[4880]: E0218 11:53:30.180449 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:31 crc kubenswrapper[4880]: I0218 11:53:31.178704 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:31 crc kubenswrapper[4880]: E0218 11:53:31.178856 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:32 crc kubenswrapper[4880]: I0218 11:53:32.179536 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:32 crc kubenswrapper[4880]: I0218 11:53:32.179536 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:32 crc kubenswrapper[4880]: I0218 11:53:32.179497 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:32 crc kubenswrapper[4880]: E0218 11:53:32.179774 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:32 crc kubenswrapper[4880]: E0218 11:53:32.179854 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:32 crc kubenswrapper[4880]: E0218 11:53:32.179932 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:33 crc kubenswrapper[4880]: I0218 11:53:33.179199 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:33 crc kubenswrapper[4880]: E0218 11:53:33.179898 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:33 crc kubenswrapper[4880]: E0218 11:53:33.194030 4880 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 18 11:53:33 crc kubenswrapper[4880]: E0218 11:53:33.277583 4880 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 11:53:34 crc kubenswrapper[4880]: I0218 11:53:34.179320 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:34 crc kubenswrapper[4880]: I0218 11:53:34.179368 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:34 crc kubenswrapper[4880]: E0218 11:53:34.179554 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:34 crc kubenswrapper[4880]: I0218 11:53:34.179367 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:34 crc kubenswrapper[4880]: E0218 11:53:34.179730 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:34 crc kubenswrapper[4880]: E0218 11:53:34.179864 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:35 crc kubenswrapper[4880]: I0218 11:53:35.179007 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:35 crc kubenswrapper[4880]: E0218 11:53:35.179598 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:36 crc kubenswrapper[4880]: I0218 11:53:36.179502 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:36 crc kubenswrapper[4880]: I0218 11:53:36.179546 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:36 crc kubenswrapper[4880]: I0218 11:53:36.179535 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:36 crc kubenswrapper[4880]: E0218 11:53:36.179693 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:36 crc kubenswrapper[4880]: E0218 11:53:36.179738 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:36 crc kubenswrapper[4880]: E0218 11:53:36.179842 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:37 crc kubenswrapper[4880]: I0218 11:53:37.179150 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:37 crc kubenswrapper[4880]: E0218 11:53:37.179340 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:38 crc kubenswrapper[4880]: I0218 11:53:38.178700 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:38 crc kubenswrapper[4880]: I0218 11:53:38.179068 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:38 crc kubenswrapper[4880]: E0218 11:53:38.179280 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:38 crc kubenswrapper[4880]: I0218 11:53:38.179689 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:38 crc kubenswrapper[4880]: E0218 11:53:38.179856 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:38 crc kubenswrapper[4880]: E0218 11:53:38.180122 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:38 crc kubenswrapper[4880]: E0218 11:53:38.279497 4880 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 11:53:39 crc kubenswrapper[4880]: I0218 11:53:39.179242 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:39 crc kubenswrapper[4880]: I0218 11:53:39.180167 4880 scope.go:117] "RemoveContainer" containerID="fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881" Feb 18 11:53:39 crc kubenswrapper[4880]: E0218 11:53:39.180278 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:39 crc kubenswrapper[4880]: I0218 11:53:39.752069 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/3.log" Feb 18 11:53:39 crc kubenswrapper[4880]: I0218 11:53:39.755462 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerStarted","Data":"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c"} Feb 18 11:53:39 crc kubenswrapper[4880]: I0218 11:53:39.756401 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:53:39 crc kubenswrapper[4880]: I0218 11:53:39.789146 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podStartSLOduration=106.78912442 podStartE2EDuration="1m46.78912442s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:39.7880524 +0000 UTC m=+127.216953291" watchObservedRunningTime="2026-02-18 11:53:39.78912442 +0000 UTC m=+127.218025281" Feb 18 11:53:40 crc kubenswrapper[4880]: I0218 11:53:40.179482 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:40 crc kubenswrapper[4880]: I0218 11:53:40.179551 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:40 crc kubenswrapper[4880]: I0218 11:53:40.179514 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:40 crc kubenswrapper[4880]: E0218 11:53:40.179733 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:40 crc kubenswrapper[4880]: E0218 11:53:40.179895 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:40 crc kubenswrapper[4880]: E0218 11:53:40.180044 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:40 crc kubenswrapper[4880]: I0218 11:53:40.213514 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nj7dq"] Feb 18 11:53:40 crc kubenswrapper[4880]: I0218 11:53:40.213661 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:40 crc kubenswrapper[4880]: E0218 11:53:40.213953 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:42 crc kubenswrapper[4880]: I0218 11:53:42.179380 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:42 crc kubenswrapper[4880]: I0218 11:53:42.179440 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:42 crc kubenswrapper[4880]: E0218 11:53:42.180219 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:42 crc kubenswrapper[4880]: I0218 11:53:42.179520 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:42 crc kubenswrapper[4880]: I0218 11:53:42.179462 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:42 crc kubenswrapper[4880]: E0218 11:53:42.180381 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:42 crc kubenswrapper[4880]: E0218 11:53:42.180504 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:42 crc kubenswrapper[4880]: E0218 11:53:42.180600 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:43 crc kubenswrapper[4880]: I0218 11:53:43.182566 4880 scope.go:117] "RemoveContainer" containerID="6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36" Feb 18 11:53:43 crc kubenswrapper[4880]: E0218 11:53:43.280023 4880 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 18 11:53:43 crc kubenswrapper[4880]: I0218 11:53:43.772474 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mh8wn_3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8/kube-multus/1.log" Feb 18 11:53:43 crc kubenswrapper[4880]: I0218 11:53:43.772561 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mh8wn" event={"ID":"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8","Type":"ContainerStarted","Data":"e257872bff6e653d23a87b41f3b00970844c0ededb3938761df49207c05f4ef6"} Feb 18 11:53:44 crc kubenswrapper[4880]: I0218 11:53:44.179133 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:44 crc kubenswrapper[4880]: I0218 11:53:44.179152 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:44 crc kubenswrapper[4880]: E0218 11:53:44.179422 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:44 crc kubenswrapper[4880]: I0218 11:53:44.179185 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:44 crc kubenswrapper[4880]: I0218 11:53:44.179164 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:44 crc kubenswrapper[4880]: E0218 11:53:44.179488 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:44 crc kubenswrapper[4880]: E0218 11:53:44.179606 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:44 crc kubenswrapper[4880]: E0218 11:53:44.179842 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:46 crc kubenswrapper[4880]: I0218 11:53:46.178705 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:46 crc kubenswrapper[4880]: E0218 11:53:46.178831 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:46 crc kubenswrapper[4880]: I0218 11:53:46.178726 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:46 crc kubenswrapper[4880]: I0218 11:53:46.178705 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:46 crc kubenswrapper[4880]: E0218 11:53:46.178910 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:46 crc kubenswrapper[4880]: E0218 11:53:46.181329 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:46 crc kubenswrapper[4880]: I0218 11:53:46.181462 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:46 crc kubenswrapper[4880]: E0218 11:53:46.181597 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:47 crc kubenswrapper[4880]: I0218 11:53:47.157311 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 11:53:48 crc kubenswrapper[4880]: I0218 11:53:48.178816 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:48 crc kubenswrapper[4880]: I0218 11:53:48.178861 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:48 crc kubenswrapper[4880]: E0218 11:53:48.180012 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 11:53:48 crc kubenswrapper[4880]: I0218 11:53:48.178957 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:48 crc kubenswrapper[4880]: E0218 11:53:48.180320 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 11:53:48 crc kubenswrapper[4880]: I0218 11:53:48.178924 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:48 crc kubenswrapper[4880]: E0218 11:53:48.180556 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 11:53:48 crc kubenswrapper[4880]: E0218 11:53:48.180026 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nj7dq" podUID="c6981df1-6d75-41e2-a41e-ac960f0a847a" Feb 18 11:53:50 crc kubenswrapper[4880]: I0218 11:53:50.179114 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:53:50 crc kubenswrapper[4880]: I0218 11:53:50.179183 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:53:50 crc kubenswrapper[4880]: I0218 11:53:50.179114 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:53:50 crc kubenswrapper[4880]: I0218 11:53:50.179297 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:53:50 crc kubenswrapper[4880]: I0218 11:53:50.182309 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 18 11:53:50 crc kubenswrapper[4880]: I0218 11:53:50.182365 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 18 11:53:50 crc kubenswrapper[4880]: I0218 11:53:50.183178 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 18 11:53:50 crc kubenswrapper[4880]: I0218 11:53:50.183217 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 18 11:53:50 crc kubenswrapper[4880]: I0218 11:53:50.183550 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 18 11:53:50 crc kubenswrapper[4880]: I0218 11:53:50.183637 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.323196 4880 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 18 11:53:51 crc 
kubenswrapper[4880]: I0218 11:53:51.381103 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vdp59"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.382028 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.382939 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-6qqbh"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.383634 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.384190 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wb62r"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.384959 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.388711 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5rcj7"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.389235 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxz72"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.389509 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.389896 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.390387 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.390736 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.391203 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-6n78q"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.392078 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6n78q" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.392508 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.393107 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.394402 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.395361 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.404008 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.405707 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.406274 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.406517 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.406717 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.406876 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.407080 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.407181 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.407251 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.407643 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.408218 4880 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.408426 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.408571 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.408758 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.408779 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.408872 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.408922 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.408884 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.408999 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.409080 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.409127 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.409227 4880 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.409320 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.409396 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.409485 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.409695 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.409836 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.409856 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.410218 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.410354 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.410407 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.412860 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.413316 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.413732 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.413959 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.414464 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g46bq"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.414728 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.414857 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.415161 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9tqb6"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.415487 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.415650 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g46bq"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.415741 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.416063 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.416455 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.428000 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.457536 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.457912 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.458043 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.459089 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.459543 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.464194 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.464518 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.464920 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.465014 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.465231 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.465353 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.465502 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.464253 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.465958 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.466043 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.464296 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.466256 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.464421 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.466843 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.466978 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.467097 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.467723 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.468570 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.469322 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.469599 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.470766 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.470993 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.471107 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.471218 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.471380 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.471516 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.471659 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.471821 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.473164 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.473847 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.474394 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bj57p"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.475338 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.479826 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.481033 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pcx4q"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.481768 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sqwt7"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.482449 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw"]
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.483249 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.483780 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.484033 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.484274 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sqwt7"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486032 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/35491756-4b45-4afa-a300-e8e55fa4b0c4-metrics-tls\") pod \"dns-operator-744455d44c-g46bq\" (UID: \"35491756-4b45-4afa-a300-e8e55fa4b0c4\") " pod="openshift-dns-operator/dns-operator-744455d44c-g46bq"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486077 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d05e33ff-25c2-4f7f-809c-7583945b4e7c-serving-cert\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486098 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486122 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qscth\" (UniqueName: \"kubernetes.io/projected/9f0337cb-0aed-41af-b587-a16392350413-kube-api-access-qscth\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486142 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7ln\" (UniqueName: \"kubernetes.io/projected/eeba5175-b082-4db4-9db5-57be919585aa-kube-api-access-gt7ln\") pod \"openshift-controller-manager-operator-756b6f6bc6-hjc5f\" (UID: \"eeba5175-b082-4db4-9db5-57be919585aa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486162 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486181 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-audit-policies\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486199 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486218 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-config\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486240 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvngc\" (UniqueName: \"kubernetes.io/projected/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-kube-api-access-cvngc\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486256 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-audit\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486281 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-config\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486297 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0337cb-0aed-41af-b587-a16392350413-serving-cert\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486320 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4153ada5-4d96-41b3-805c-ab5464fc34ba-webhook-cert\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486340 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-serving-cert\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486360 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-oauth-config\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486378 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486398 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-config\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486412 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-serving-cert\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486430 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw4b7\" (UniqueName: \"kubernetes.io/projected/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-kube-api-access-lw4b7\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486447 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b634ffd-eba0-480b-8e1a-7e2f26582d07-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n4tsp\" (UID: \"6b634ffd-eba0-480b-8e1a-7e2f26582d07\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486463 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk7xx\" (UniqueName: \"kubernetes.io/projected/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-kube-api-access-tk7xx\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486484 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486506 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-serving-cert\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486530 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfbts\" (UniqueName: \"kubernetes.io/projected/35491756-4b45-4afa-a300-e8e55fa4b0c4-kube-api-access-gfbts\") pod \"dns-operator-744455d44c-g46bq\" (UID: \"35491756-4b45-4afa-a300-e8e55fa4b0c4\") " pod="openshift-dns-operator/dns-operator-744455d44c-g46bq"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486554 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b634ffd-eba0-480b-8e1a-7e2f26582d07-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n4tsp\" (UID: \"6b634ffd-eba0-480b-8e1a-7e2f26582d07\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486579 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znfxb\" (UniqueName: \"kubernetes.io/projected/1cb59275-0ea1-401b-b631-cf092f94b742-kube-api-access-znfxb\") pod \"openshift-config-operator-7777fb866f-vdwwz\" (UID: \"1cb59275-0ea1-401b-b631-cf092f94b742\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486738 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jb5b\" (UniqueName: \"kubernetes.io/projected/71911446-2b7a-45cb-8dc5-277b629ee152-kube-api-access-9jb5b\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486782 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d05e33ff-25c2-4f7f-809c-7583945b4e7c-encryption-config\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486806 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-dir\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486831 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486854 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-policies\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486875 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d05e33ff-25c2-4f7f-809c-7583945b4e7c-audit-dir\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486970 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd7f5\" (UniqueName: \"kubernetes.io/projected/4153ada5-4d96-41b3-805c-ab5464fc34ba-kube-api-access-rd7f5\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.486997 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v45b\" (UniqueName: \"kubernetes.io/projected/c5edee35-4ff0-4da6-8db9-8cc26218d3f7-kube-api-access-8v45b\") pod \"cluster-samples-operator-665b6dd947-t4jzt\" (UID: \"c5edee35-4ff0-4da6-8db9-8cc26218d3f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487055 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/71911446-2b7a-45cb-8dc5-277b629ee152-machine-approver-tls\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487125 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d05e33ff-25c2-4f7f-809c-7583945b4e7c-node-pullsecrets\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487180 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvtgq\" (UniqueName: \"kubernetes.io/projected/5d4532b3-caee-400d-992c-023a97f6d0ca-kube-api-access-jvtgq\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487216 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeba5175-b082-4db4-9db5-57be919585aa-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-hjc5f\" (UID: \"eeba5175-b082-4db4-9db5-57be919585aa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487242 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b634ffd-eba0-480b-8e1a-7e2f26582d07-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n4tsp\" (UID: \"6b634ffd-eba0-480b-8e1a-7e2f26582d07\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487270 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487311 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4153ada5-4d96-41b3-805c-ab5464fc34ba-tmpfs\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487340 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487364 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-image-import-ca\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487394 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487425 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d05e33ff-25c2-4f7f-809c-7583945b4e7c-etcd-client\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487449 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487529 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487556 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-client-ca\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487594 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwnzb\" (UniqueName: \"kubernetes.io/projected/d05e33ff-25c2-4f7f-809c-7583945b4e7c-kube-api-access-vwnzb\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487664 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flsmp\" (UniqueName: \"kubernetes.io/projected/fbca9013-a5ce-4d98-8552-503f5b2d8f45-kube-api-access-flsmp\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.487985 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4153ada5-4d96-41b3-805c-ab5464fc34ba-apiservice-cert\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488017 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488062 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488086 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71911446-2b7a-45cb-8dc5-277b629ee152-auth-proxy-config\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488264 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-config\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488362 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeba5175-b082-4db4-9db5-57be919585aa-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-hjc5f\" (UID: \"eeba5175-b082-4db4-9db5-57be919585aa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f"
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488434 4880 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-service-ca-bundle\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488484 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-config\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488519 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-config\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488550 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-client-ca\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488577 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-audit-dir\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: 
\"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488681 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488711 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhngp\" (UniqueName: \"kubernetes.io/projected/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-kube-api-access-lhngp\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488748 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-images\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488776 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1cb59275-0ea1-401b-b631-cf092f94b742-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vdwwz\" (UID: \"1cb59275-0ea1-401b-b631-cf092f94b742\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488818 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-etcd-serving-ca\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488844 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbca9013-a5ce-4d98-8552-503f5b2d8f45-serving-cert\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488864 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb59275-0ea1-401b-b631-cf092f94b742-serving-cert\") pod \"openshift-config-operator-7777fb866f-vdwwz\" (UID: \"1cb59275-0ea1-401b-b631-cf092f94b742\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488893 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5edee35-4ff0-4da6-8db9-8cc26218d3f7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-t4jzt\" (UID: \"c5edee35-4ff0-4da6-8db9-8cc26218d3f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488927 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-service-ca\") pod \"console-f9d7485db-6qqbh\" (UID: 
\"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488952 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.488976 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.489006 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-etcd-client\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.489070 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-oauth-serving-cert\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.489097 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-encryption-config\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.489142 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndqfb\" (UniqueName: \"kubernetes.io/projected/129989af-f3cc-44bf-9779-6688338d3130-kube-api-access-ndqfb\") pod \"downloads-7954f5f757-6n78q\" (UID: \"129989af-f3cc-44bf-9779-6688338d3130\") " pod="openshift-console/downloads-7954f5f757-6n78q" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.489211 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-trusted-ca-bundle\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.489236 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71911446-2b7a-45cb-8dc5-277b629ee152-config\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.489269 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " 
pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.495882 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-429s2"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.496704 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-45gzd"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.497214 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.497585 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.504893 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xzmw2"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.505703 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.512769 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.513866 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.514385 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.514510 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.523730 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.524041 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.524422 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.525554 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-t9dmw"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.526043 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.526060 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.526639 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.527404 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.531892 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.532400 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.532749 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.532880 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.535687 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.536913 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.537206 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.537307 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.537400 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.537474 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" 
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.537679 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.537696 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.537842 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.537883 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.538080 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.538108 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.538218 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.538259 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.538354 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.538381 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 18 11:53:51 
crc kubenswrapper[4880]: I0218 11:53:51.538516 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.538519 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.538773 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.537318 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.539144 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.539788 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.540025 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.540146 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.540904 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.559481 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.593568 4880 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.593899 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.594426 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.594950 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595550 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-service-ca-bundle\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595634 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-config\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595686 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: 
\"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595728 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr2t9\" (UniqueName: \"kubernetes.io/projected/3f0bd703-4d46-4dfd-942d-3b554f736833-kube-api-access-cr2t9\") pod \"olm-operator-6b444d44fb-6l7kh\" (UID: \"3f0bd703-4d46-4dfd-942d-3b554f736833\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595755 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8561646-dee0-44aa-a718-fdeaaf0ff34b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ss7t9\" (UID: \"f8561646-dee0-44aa-a718-fdeaaf0ff34b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595768 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595778 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-stats-auth\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595804 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-config\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " 
pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595827 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-client-ca\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595850 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-audit-dir\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595874 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595884 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.596194 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.596325 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.596930 4880 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-service-ca-bundle\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597317 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597333 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.595899 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhngp\" (UniqueName: \"kubernetes.io/projected/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-kube-api-access-lhngp\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597468 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597478 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-images\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597538 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/502f8aab-255f-468a-a21d-3c1fe4aaf8b2-serving-cert\") pod \"service-ca-operator-777779d784-dfz9j\" (UID: \"502f8aab-255f-468a-a21d-3c1fe4aaf8b2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597563 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1cb59275-0ea1-401b-b631-cf092f94b742-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vdwwz\" (UID: \"1cb59275-0ea1-401b-b631-cf092f94b742\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597591 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-etcd-serving-ca\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597630 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbca9013-a5ce-4d98-8552-503f5b2d8f45-serving-cert\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597649 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb59275-0ea1-401b-b631-cf092f94b742-serving-cert\") pod \"openshift-config-operator-7777fb866f-vdwwz\" (UID: \"1cb59275-0ea1-401b-b631-cf092f94b742\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 
11:53:51.597667 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5edee35-4ff0-4da6-8db9-8cc26218d3f7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-t4jzt\" (UID: \"c5edee35-4ff0-4da6-8db9-8cc26218d3f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597688 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-service-ca\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597718 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597738 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597757 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-etcd-client\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: 
\"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597775 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-default-certificate\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597808 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-oauth-serving-cert\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597827 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-encryption-config\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597847 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1dfb09-2907-4e18-a008-e74e9fc62482-serving-cert\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597864 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/502f8aab-255f-468a-a21d-3c1fe4aaf8b2-config\") pod \"service-ca-operator-777779d784-dfz9j\" (UID: \"502f8aab-255f-468a-a21d-3c1fe4aaf8b2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597887 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndqfb\" (UniqueName: \"kubernetes.io/projected/129989af-f3cc-44bf-9779-6688338d3130-kube-api-access-ndqfb\") pod \"downloads-7954f5f757-6n78q\" (UID: \"129989af-f3cc-44bf-9779-6688338d3130\") " pod="openshift-console/downloads-7954f5f757-6n78q" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597905 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9b90b292-7e22-4eb8-872f-d5552b334604-signing-cabundle\") pod \"service-ca-9c57cc56f-45gzd\" (UID: \"9b90b292-7e22-4eb8-872f-d5552b334604\") " pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597927 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/594db65b-f9dd-4b9b-a572-e52e3d5a5875-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-429s2\" (UID: \"594db65b-f9dd-4b9b-a572-e52e3d5a5875\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597945 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9b90b292-7e22-4eb8-872f-d5552b334604-signing-key\") pod \"service-ca-9c57cc56f-45gzd\" (UID: \"9b90b292-7e22-4eb8-872f-d5552b334604\") " pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597963 4880 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597978 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-trusted-ca-bundle\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.597993 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71911446-2b7a-45cb-8dc5-277b629ee152-config\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598011 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g48xw\" (UniqueName: \"kubernetes.io/projected/9f587184-defe-4d9a-83a8-7cd95f215a55-kube-api-access-g48xw\") pod \"openshift-apiserver-operator-796bbdcf4f-lrf5s\" (UID: \"9f587184-defe-4d9a-83a8-7cd95f215a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598028 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqdj\" (UniqueName: \"kubernetes.io/projected/89f16e7c-79df-4d83-85bf-e714bbd768fc-kube-api-access-qcqdj\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " 
pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598047 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/35491756-4b45-4afa-a300-e8e55fa4b0c4-metrics-tls\") pod \"dns-operator-744455d44c-g46bq\" (UID: \"35491756-4b45-4afa-a300-e8e55fa4b0c4\") " pod="openshift-dns-operator/dns-operator-744455d44c-g46bq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598066 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d05e33ff-25c2-4f7f-809c-7583945b4e7c-serving-cert\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598081 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598096 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qscth\" (UniqueName: \"kubernetes.io/projected/9f0337cb-0aed-41af-b587-a16392350413-kube-api-access-qscth\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598115 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7ln\" (UniqueName: \"kubernetes.io/projected/eeba5175-b082-4db4-9db5-57be919585aa-kube-api-access-gt7ln\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-hjc5f\" (UID: \"eeba5175-b082-4db4-9db5-57be919585aa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598120 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-config\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598134 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj247\" (UniqueName: \"kubernetes.io/projected/9b90b292-7e22-4eb8-872f-d5552b334604-kube-api-access-mj247\") pod \"service-ca-9c57cc56f-45gzd\" (UID: \"9b90b292-7e22-4eb8-872f-d5552b334604\") " pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598179 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-metrics-certs\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598231 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598264 4880 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-audit-policies\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598293 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8p7r\" (UniqueName: \"kubernetes.io/projected/fa1dfb09-2907-4e18-a008-e74e9fc62482-kube-api-access-m8p7r\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598328 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598357 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-config\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598386 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3f0bd703-4d46-4dfd-942d-3b554f736833-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6l7kh\" (UID: \"3f0bd703-4d46-4dfd-942d-3b554f736833\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598413 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8561646-dee0-44aa-a718-fdeaaf0ff34b-config\") pod \"kube-controller-manager-operator-78b949d7b-ss7t9\" (UID: \"f8561646-dee0-44aa-a718-fdeaaf0ff34b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598440 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccde08de-7978-48ba-a59b-34a979ab7fa8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598464 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccde08de-7978-48ba-a59b-34a979ab7fa8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598494 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvngc\" (UniqueName: \"kubernetes.io/projected/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-kube-api-access-cvngc\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc 
kubenswrapper[4880]: I0218 11:53:51.598523 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-audit\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598554 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-config\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.598586 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0337cb-0aed-41af-b587-a16392350413-serving-cert\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.599822 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.602308 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-etcd-serving-ca\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.602055 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-images\") pod 
\"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.602964 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-audit-policies\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.604362 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.601195 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1cb59275-0ea1-401b-b631-cf092f94b742-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vdwwz\" (UID: \"1cb59275-0ea1-401b-b631-cf092f94b742\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608469 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608580 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4153ada5-4d96-41b3-805c-ab5464fc34ba-webhook-cert\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608638 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-serving-cert\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608668 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-oauth-config\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608705 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t96gz\" (UniqueName: \"kubernetes.io/projected/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-kube-api-access-t96gz\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608739 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3f0bd703-4d46-4dfd-942d-3b554f736833-srv-cert\") pod \"olm-operator-6b444d44fb-6l7kh\" (UID: \"3f0bd703-4d46-4dfd-942d-3b554f736833\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608767 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6b634ffd-eba0-480b-8e1a-7e2f26582d07-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n4tsp\" (UID: \"6b634ffd-eba0-480b-8e1a-7e2f26582d07\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608797 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk7xx\" (UniqueName: \"kubernetes.io/projected/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-kube-api-access-tk7xx\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608828 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608855 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-config\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608878 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-serving-cert\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608908 4880 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4b7\" (UniqueName: \"kubernetes.io/projected/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-kube-api-access-lw4b7\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608934 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-serving-cert\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608960 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.608990 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfbts\" (UniqueName: \"kubernetes.io/projected/35491756-4b45-4afa-a300-e8e55fa4b0c4-kube-api-access-gfbts\") pod \"dns-operator-744455d44c-g46bq\" (UID: \"35491756-4b45-4afa-a300-e8e55fa4b0c4\") " pod="openshift-dns-operator/dns-operator-744455d44c-g46bq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609013 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b634ffd-eba0-480b-8e1a-7e2f26582d07-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n4tsp\" (UID: \"6b634ffd-eba0-480b-8e1a-7e2f26582d07\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609036 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znfxb\" (UniqueName: \"kubernetes.io/projected/1cb59275-0ea1-401b-b631-cf092f94b742-kube-api-access-znfxb\") pod \"openshift-config-operator-7777fb866f-vdwwz\" (UID: \"1cb59275-0ea1-401b-b631-cf092f94b742\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609067 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jb5b\" (UniqueName: \"kubernetes.io/projected/71911446-2b7a-45cb-8dc5-277b629ee152-kube-api-access-9jb5b\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609119 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d05e33ff-25c2-4f7f-809c-7583945b4e7c-encryption-config\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609144 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-policies\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609170 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-dir\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609195 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609237 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1dfb09-2907-4e18-a008-e74e9fc62482-config\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609268 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f587184-defe-4d9a-83a8-7cd95f215a55-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lrf5s\" (UID: \"9f587184-defe-4d9a-83a8-7cd95f215a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609291 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d05e33ff-25c2-4f7f-809c-7583945b4e7c-audit-dir\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609318 4880 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcjf7\" (UniqueName: \"kubernetes.io/projected/ccde08de-7978-48ba-a59b-34a979ab7fa8-kube-api-access-mcjf7\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609356 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd7f5\" (UniqueName: \"kubernetes.io/projected/4153ada5-4d96-41b3-805c-ab5464fc34ba-kube-api-access-rd7f5\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609422 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v45b\" (UniqueName: \"kubernetes.io/projected/c5edee35-4ff0-4da6-8db9-8cc26218d3f7-kube-api-access-8v45b\") pod \"cluster-samples-operator-665b6dd947-t4jzt\" (UID: \"c5edee35-4ff0-4da6-8db9-8cc26218d3f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609448 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/71911446-2b7a-45cb-8dc5-277b629ee152-machine-approver-tls\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609473 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/fa1dfb09-2907-4e18-a008-e74e9fc62482-trusted-ca\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609499 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8561646-dee0-44aa-a718-fdeaaf0ff34b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ss7t9\" (UID: \"f8561646-dee0-44aa-a718-fdeaaf0ff34b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609539 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d05e33ff-25c2-4f7f-809c-7583945b4e7c-node-pullsecrets\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609570 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvtgq\" (UniqueName: \"kubernetes.io/projected/5d4532b3-caee-400d-992c-023a97f6d0ca-kube-api-access-jvtgq\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609627 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeba5175-b082-4db4-9db5-57be919585aa-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-hjc5f\" (UID: \"eeba5175-b082-4db4-9db5-57be919585aa\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609652 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b634ffd-eba0-480b-8e1a-7e2f26582d07-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n4tsp\" (UID: \"6b634ffd-eba0-480b-8e1a-7e2f26582d07\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609680 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609711 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpw42\" (UniqueName: \"kubernetes.io/projected/502f8aab-255f-468a-a21d-3c1fe4aaf8b2-kube-api-access-qpw42\") pod \"service-ca-operator-777779d784-dfz9j\" (UID: \"502f8aab-255f-468a-a21d-3c1fe4aaf8b2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609744 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4153ada5-4d96-41b3-805c-ab5464fc34ba-tmpfs\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609771 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609797 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-image-import-ca\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609826 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d05e33ff-25c2-4f7f-809c-7583945b4e7c-etcd-client\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609852 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609876 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609902 4880 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609933 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89f16e7c-79df-4d83-85bf-e714bbd768fc-service-ca-bundle\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.609980 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwnzb\" (UniqueName: \"kubernetes.io/projected/d05e33ff-25c2-4f7f-809c-7583945b4e7c-kube-api-access-vwnzb\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610008 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flsmp\" (UniqueName: \"kubernetes.io/projected/fbca9013-a5ce-4d98-8552-503f5b2d8f45-kube-api-access-flsmp\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610033 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610059 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-client-ca\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610084 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f587184-defe-4d9a-83a8-7cd95f215a55-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lrf5s\" (UID: \"9f587184-defe-4d9a-83a8-7cd95f215a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610129 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4153ada5-4d96-41b3-805c-ab5464fc34ba-apiservice-cert\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610157 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610180 4880 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610204 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71911446-2b7a-45cb-8dc5-277b629ee152-auth-proxy-config\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610231 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hpdw\" (UniqueName: \"kubernetes.io/projected/594db65b-f9dd-4b9b-a572-e52e3d5a5875-kube-api-access-2hpdw\") pod \"multus-admission-controller-857f4d67dd-429s2\" (UID: \"594db65b-f9dd-4b9b-a572-e52e3d5a5875\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610267 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-config\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.610290 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeba5175-b082-4db4-9db5-57be919585aa-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-hjc5f\" (UID: \"eeba5175-b082-4db4-9db5-57be919585aa\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.613701 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.614309 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-audit-dir\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.614408 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-config\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.615650 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-config\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.615738 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeba5175-b082-4db4-9db5-57be919585aa-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-hjc5f\" (UID: 
\"eeba5175-b082-4db4-9db5-57be919585aa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.616341 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbca9013-a5ce-4d98-8552-503f5b2d8f45-serving-cert\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.616387 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.616414 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-audit\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.617127 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-config\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.617505 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.617562 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d05e33ff-25c2-4f7f-809c-7583945b4e7c-audit-dir\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.618381 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-service-ca\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.618926 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.619380 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.624346 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.624471 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.624563 4880 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.628286 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-client-ca\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.632267 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.632759 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.633198 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.633872 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/71911446-2b7a-45cb-8dc5-277b629ee152-machine-approver-tls\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.634062 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.634753 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vdp59"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.634938 
4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.634991 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.635430 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0337cb-0aed-41af-b587-a16392350413-serving-cert\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.635824 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d05e33ff-25c2-4f7f-809c-7583945b4e7c-serving-cert\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.635989 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.636100 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71911446-2b7a-45cb-8dc5-277b629ee152-config\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.636425 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.636469 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-serving-cert\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.636728 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-oauth-serving-cert\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.636745 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d05e33ff-25c2-4f7f-809c-7583945b4e7c-node-pullsecrets\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.637137 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.638210 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.638799 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.638876 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-config\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.639796 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b634ffd-eba0-480b-8e1a-7e2f26582d07-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n4tsp\" (UID: \"6b634ffd-eba0-480b-8e1a-7e2f26582d07\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.640599 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeba5175-b082-4db4-9db5-57be919585aa-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-hjc5f\" (UID: \"eeba5175-b082-4db4-9db5-57be919585aa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 
11:53:51.643197 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.643741 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-trusted-ca-bundle\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.643963 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.644135 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-serving-cert\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.644321 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4153ada5-4d96-41b3-805c-ab5464fc34ba-tmpfs\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.644389 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.646304 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb59275-0ea1-401b-b631-cf092f94b742-serving-cert\") pod \"openshift-config-operator-7777fb866f-vdwwz\" (UID: \"1cb59275-0ea1-401b-b631-cf092f94b742\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.647789 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d05e33ff-25c2-4f7f-809c-7583945b4e7c-encryption-config\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.648560 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-serving-cert\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.649876 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-client-ca\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.650170 4880 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.650513 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.650563 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-oauth-config\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.651029 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71911446-2b7a-45cb-8dc5-277b629ee152-auth-proxy-config\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.651261 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.651385 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-dir\") pod 
\"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.651884 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b634ffd-eba0-480b-8e1a-7e2f26582d07-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n4tsp\" (UID: \"6b634ffd-eba0-480b-8e1a-7e2f26582d07\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.652509 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.655505 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-policies\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.656626 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-config\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.656923 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.657048 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.681997 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-encryption-config\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.682519 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4153ada5-4d96-41b3-805c-ab5464fc34ba-apiservice-cert\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.682918 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.683002 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5edee35-4ff0-4da6-8db9-8cc26218d3f7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-t4jzt\" (UID: 
\"c5edee35-4ff0-4da6-8db9-8cc26218d3f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.683141 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.657509 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.683420 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/35491756-4b45-4afa-a300-e8e55fa4b0c4-metrics-tls\") pod \"dns-operator-744455d44c-g46bq\" (UID: \"35491756-4b45-4afa-a300-e8e55fa4b0c4\") " pod="openshift-dns-operator/dns-operator-744455d44c-g46bq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.683448 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4153ada5-4d96-41b3-805c-ab5464fc34ba-webhook-cert\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.683803 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 
crc kubenswrapper[4880]: I0218 11:53:51.683815 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.688573 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.684103 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.688273 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.688946 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-etcd-client\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.688284 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d05e33ff-25c2-4f7f-809c-7583945b4e7c-image-import-ca\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.692361 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6qqbh"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 
11:53:51.693664 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.697428 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5rcj7"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.698595 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d05e33ff-25c2-4f7f-809c-7583945b4e7c-etcd-client\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.700765 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wb62r"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.702298 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.704523 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.705461 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.708128 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz"] Feb 18 11:53:51 crc 
kubenswrapper[4880]: I0218 11:53:51.709535 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.711318 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-default-certificate\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.711464 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxz72"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.711526 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1dfb09-2907-4e18-a008-e74e9fc62482-serving-cert\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.711562 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/502f8aab-255f-468a-a21d-3c1fe4aaf8b2-config\") pod \"service-ca-operator-777779d784-dfz9j\" (UID: \"502f8aab-255f-468a-a21d-3c1fe4aaf8b2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.711592 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9b90b292-7e22-4eb8-872f-d5552b334604-signing-cabundle\") pod \"service-ca-9c57cc56f-45gzd\" (UID: \"9b90b292-7e22-4eb8-872f-d5552b334604\") " pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" 
Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.711948 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/594db65b-f9dd-4b9b-a572-e52e3d5a5875-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-429s2\" (UID: \"594db65b-f9dd-4b9b-a572-e52e3d5a5875\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712116 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9b90b292-7e22-4eb8-872f-d5552b334604-signing-key\") pod \"service-ca-9c57cc56f-45gzd\" (UID: \"9b90b292-7e22-4eb8-872f-d5552b334604\") " pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712166 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g48xw\" (UniqueName: \"kubernetes.io/projected/9f587184-defe-4d9a-83a8-7cd95f215a55-kube-api-access-g48xw\") pod \"openshift-apiserver-operator-796bbdcf4f-lrf5s\" (UID: \"9f587184-defe-4d9a-83a8-7cd95f215a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712189 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcqdj\" (UniqueName: \"kubernetes.io/projected/89f16e7c-79df-4d83-85bf-e714bbd768fc-kube-api-access-qcqdj\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712220 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj247\" (UniqueName: \"kubernetes.io/projected/9b90b292-7e22-4eb8-872f-d5552b334604-kube-api-access-mj247\") pod \"service-ca-9c57cc56f-45gzd\" (UID: 
\"9b90b292-7e22-4eb8-872f-d5552b334604\") " pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712237 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-metrics-certs\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712254 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8p7r\" (UniqueName: \"kubernetes.io/projected/fa1dfb09-2907-4e18-a008-e74e9fc62482-kube-api-access-m8p7r\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712278 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3f0bd703-4d46-4dfd-942d-3b554f736833-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6l7kh\" (UID: \"3f0bd703-4d46-4dfd-942d-3b554f736833\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712298 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8561646-dee0-44aa-a718-fdeaaf0ff34b-config\") pod \"kube-controller-manager-operator-78b949d7b-ss7t9\" (UID: \"f8561646-dee0-44aa-a718-fdeaaf0ff34b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712318 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ccde08de-7978-48ba-a59b-34a979ab7fa8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712340 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccde08de-7978-48ba-a59b-34a979ab7fa8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712367 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712388 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t96gz\" (UniqueName: \"kubernetes.io/projected/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-kube-api-access-t96gz\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712404 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3f0bd703-4d46-4dfd-942d-3b554f736833-srv-cert\") pod \"olm-operator-6b444d44fb-6l7kh\" (UID: \"3f0bd703-4d46-4dfd-942d-3b554f736833\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712474 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1dfb09-2907-4e18-a008-e74e9fc62482-config\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712493 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f587184-defe-4d9a-83a8-7cd95f215a55-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lrf5s\" (UID: \"9f587184-defe-4d9a-83a8-7cd95f215a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712524 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcjf7\" (UniqueName: \"kubernetes.io/projected/ccde08de-7978-48ba-a59b-34a979ab7fa8-kube-api-access-mcjf7\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712556 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa1dfb09-2907-4e18-a008-e74e9fc62482-trusted-ca\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712574 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/f8561646-dee0-44aa-a718-fdeaaf0ff34b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ss7t9\" (UID: \"f8561646-dee0-44aa-a718-fdeaaf0ff34b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712600 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpw42\" (UniqueName: \"kubernetes.io/projected/502f8aab-255f-468a-a21d-3c1fe4aaf8b2-kube-api-access-qpw42\") pod \"service-ca-operator-777779d784-dfz9j\" (UID: \"502f8aab-255f-468a-a21d-3c1fe4aaf8b2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712643 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712672 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89f16e7c-79df-4d83-85bf-e714bbd768fc-service-ca-bundle\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712720 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f587184-defe-4d9a-83a8-7cd95f215a55-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lrf5s\" (UID: \"9f587184-defe-4d9a-83a8-7cd95f215a55\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712772 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hpdw\" (UniqueName: \"kubernetes.io/projected/594db65b-f9dd-4b9b-a572-e52e3d5a5875-kube-api-access-2hpdw\") pod \"multus-admission-controller-857f4d67dd-429s2\" (UID: \"594db65b-f9dd-4b9b-a572-e52e3d5a5875\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712805 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712823 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr2t9\" (UniqueName: \"kubernetes.io/projected/3f0bd703-4d46-4dfd-942d-3b554f736833-kube-api-access-cr2t9\") pod \"olm-operator-6b444d44fb-6l7kh\" (UID: \"3f0bd703-4d46-4dfd-942d-3b554f736833\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712839 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8561646-dee0-44aa-a718-fdeaaf0ff34b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ss7t9\" (UID: \"f8561646-dee0-44aa-a718-fdeaaf0ff34b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712858 4880 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-stats-auth\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.712885 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/502f8aab-255f-468a-a21d-3c1fe4aaf8b2-serving-cert\") pod \"service-ca-operator-777779d784-dfz9j\" (UID: \"502f8aab-255f-468a-a21d-3c1fe4aaf8b2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.713184 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.714876 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8561646-dee0-44aa-a718-fdeaaf0ff34b-config\") pod \"kube-controller-manager-operator-78b949d7b-ss7t9\" (UID: \"f8561646-dee0-44aa-a718-fdeaaf0ff34b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.714921 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.715065 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:51 crc kubenswrapper[4880]: 
I0218 11:53:51.718084 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8561646-dee0-44aa-a718-fdeaaf0ff34b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ss7t9\" (UID: \"f8561646-dee0-44aa-a718-fdeaaf0ff34b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.718733 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f587184-defe-4d9a-83a8-7cd95f215a55-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lrf5s\" (UID: \"9f587184-defe-4d9a-83a8-7cd95f215a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.719467 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.719717 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3f0bd703-4d46-4dfd-942d-3b554f736833-srv-cert\") pod \"olm-operator-6b444d44fb-6l7kh\" (UID: \"3f0bd703-4d46-4dfd-942d-3b554f736833\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.720270 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3f0bd703-4d46-4dfd-942d-3b554f736833-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6l7kh\" (UID: 
\"3f0bd703-4d46-4dfd-942d-3b554f736833\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.720489 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.722041 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.725020 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.726678 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4pt2m"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.729042 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g46bq"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.729216 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.731772 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6n78q"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.733253 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-stc5w"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.733968 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-stc5w" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.734939 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f587184-defe-4d9a-83a8-7cd95f215a55-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lrf5s\" (UID: \"9f587184-defe-4d9a-83a8-7cd95f215a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.735010 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-j7s2s"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.735472 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.736148 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pcx4q"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.738866 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.740272 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xzmw2"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.741796 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-429s2"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.743223 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.744695 4880 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.744711 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.746363 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.749655 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.751144 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.752674 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sqwt7"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.754244 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9tqb6"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.755825 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bj57p"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.757424 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.758951 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-45gzd"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.760389 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.761866 4880 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.763332 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.763933 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.764749 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.766311 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.767825 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.769309 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4pt2m"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.770767 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8wwmq"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.772319 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.772476 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.773828 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8wwmq"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.775309 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-stc5w"] Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.784806 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.804577 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.824243 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.845936 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.864767 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.884351 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.904809 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.924310 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.945581 4880 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.974839 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 18 11:53:51 crc kubenswrapper[4880]: I0218 11:53:51.985119 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.005499 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.031344 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.044723 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.064707 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.084939 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.097340 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1dfb09-2907-4e18-a008-e74e9fc62482-serving-cert\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.104415 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 
11:53:52.124713 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.145580 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.164932 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.184414 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.213463 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.215397 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa1dfb09-2907-4e18-a008-e74e9fc62482-trusted-ca\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.224160 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.234695 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1dfb09-2907-4e18-a008-e74e9fc62482-config\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.244171 4880 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.264739 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.276501 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/594db65b-f9dd-4b9b-a572-e52e3d5a5875-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-429s2\" (UID: \"594db65b-f9dd-4b9b-a572-e52e3d5a5875\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.285644 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.304578 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.317707 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9b90b292-7e22-4eb8-872f-d5552b334604-signing-key\") pod \"service-ca-9c57cc56f-45gzd\" (UID: \"9b90b292-7e22-4eb8-872f-d5552b334604\") " pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.324318 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.333129 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9b90b292-7e22-4eb8-872f-d5552b334604-signing-cabundle\") pod \"service-ca-9c57cc56f-45gzd\" (UID: \"9b90b292-7e22-4eb8-872f-d5552b334604\") " pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 
11:53:52.344376 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.365216 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.384905 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.404835 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.423753 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.450071 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.464994 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.488318 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.505265 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.513303 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/502f8aab-255f-468a-a21d-3c1fe4aaf8b2-config\") pod \"service-ca-operator-777779d784-dfz9j\" (UID: \"502f8aab-255f-468a-a21d-3c1fe4aaf8b2\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.522644 4880 request.go:700] Waited for 1.007746408s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.524990 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.537050 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/502f8aab-255f-468a-a21d-3c1fe4aaf8b2-serving-cert\") pod \"service-ca-operator-777779d784-dfz9j\" (UID: \"502f8aab-255f-468a-a21d-3c1fe4aaf8b2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.545676 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.565265 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.583979 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.605203 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.624313 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 18 11:53:52 
crc kubenswrapper[4880]: I0218 11:53:52.645518 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.664909 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.685275 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.695475 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-default-certificate\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.704634 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 18 11:53:52 crc kubenswrapper[4880]: E0218 11:53:52.713866 4880 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition Feb 18 11:53:52 crc kubenswrapper[4880]: E0218 11:53:52.713919 4880 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 18 11:53:52 crc kubenswrapper[4880]: E0218 11:53:52.713987 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ccde08de-7978-48ba-a59b-34a979ab7fa8-config podName:ccde08de-7978-48ba-a59b-34a979ab7fa8 nodeName:}" failed. No retries permitted until 2026-02-18 11:53:53.213960823 +0000 UTC m=+140.642861684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ccde08de-7978-48ba-a59b-34a979ab7fa8-config") pod "kube-storage-version-migrator-operator-b67b599dd-xhznq" (UID: "ccde08de-7978-48ba-a59b-34a979ab7fa8") : failed to sync configmap cache: timed out waiting for the condition Feb 18 11:53:52 crc kubenswrapper[4880]: E0218 11:53:52.714033 4880 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 18 11:53:52 crc kubenswrapper[4880]: E0218 11:53:52.714085 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccde08de-7978-48ba-a59b-34a979ab7fa8-serving-cert podName:ccde08de-7978-48ba-a59b-34a979ab7fa8 nodeName:}" failed. No retries permitted until 2026-02-18 11:53:53.214047026 +0000 UTC m=+140.642948077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ccde08de-7978-48ba-a59b-34a979ab7fa8-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-xhznq" (UID: "ccde08de-7978-48ba-a59b-34a979ab7fa8") : failed to sync secret cache: timed out waiting for the condition Feb 18 11:53:52 crc kubenswrapper[4880]: E0218 11:53:52.714196 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/89f16e7c-79df-4d83-85bf-e714bbd768fc-service-ca-bundle podName:89f16e7c-79df-4d83-85bf-e714bbd768fc nodeName:}" failed. No retries permitted until 2026-02-18 11:53:53.214160499 +0000 UTC m=+140.643061580 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/89f16e7c-79df-4d83-85bf-e714bbd768fc-service-ca-bundle") pod "router-default-5444994796-t9dmw" (UID: "89f16e7c-79df-4d83-85bf-e714bbd768fc") : failed to sync configmap cache: timed out waiting for the condition Feb 18 11:53:52 crc kubenswrapper[4880]: E0218 11:53:52.716974 4880 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Feb 18 11:53:52 crc kubenswrapper[4880]: E0218 11:53:52.717075 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-metrics-certs podName:89f16e7c-79df-4d83-85bf-e714bbd768fc nodeName:}" failed. No retries permitted until 2026-02-18 11:53:53.21705246 +0000 UTC m=+140.645953321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-metrics-certs") pod "router-default-5444994796-t9dmw" (UID: "89f16e7c-79df-4d83-85bf-e714bbd768fc") : failed to sync secret cache: timed out waiting for the condition Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.717512 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-stats-auth\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.724289 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.744109 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 
11:53:52.764133 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.784437 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.803774 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.825416 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.844902 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.864625 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.884979 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.924716 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.944635 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 11:53:52.965900 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 18 11:53:52 crc kubenswrapper[4880]: I0218 
11:53:52.984345 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.004635 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.039698 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhngp\" (UniqueName: \"kubernetes.io/projected/0329fe9d-0f10-43a2-a74b-657ffe52fcf3-kube-api-access-lhngp\") pod \"apiserver-7bbb656c7d-sdkl4\" (UID: \"0329fe9d-0f10-43a2-a74b-657ffe52fcf3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.060971 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvngc\" (UniqueName: \"kubernetes.io/projected/7fc0ff29-742b-466a-84bb-4a7fa8bdb86d-kube-api-access-cvngc\") pod \"authentication-operator-69f744f599-5rcj7\" (UID: \"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.065914 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.085526 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.105098 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.140743 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd7f5\" 
(UniqueName: \"kubernetes.io/projected/4153ada5-4d96-41b3-805c-ab5464fc34ba-kube-api-access-rd7f5\") pod \"packageserver-d55dfcdfc-lcr7n\" (UID: \"4153ada5-4d96-41b3-805c-ab5464fc34ba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.159528 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v45b\" (UniqueName: \"kubernetes.io/projected/c5edee35-4ff0-4da6-8db9-8cc26218d3f7-kube-api-access-8v45b\") pod \"cluster-samples-operator-665b6dd947-t4jzt\" (UID: \"c5edee35-4ff0-4da6-8db9-8cc26218d3f7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.200325 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw4b7\" (UniqueName: \"kubernetes.io/projected/7f8c1f5b-ef29-4f12-88af-7387efdd41e6-kube-api-access-lw4b7\") pod \"machine-api-operator-5694c8668f-vdp59\" (UID: \"7f8c1f5b-ef29-4f12-88af-7387efdd41e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.206244 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.222116 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qscth\" (UniqueName: \"kubernetes.io/projected/9f0337cb-0aed-41af-b587-a16392350413-kube-api-access-qscth\") pod \"controller-manager-879f6c89f-sxz72\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.234255 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89f16e7c-79df-4d83-85bf-e714bbd768fc-service-ca-bundle\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.234423 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-metrics-certs\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.234453 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccde08de-7978-48ba-a59b-34a979ab7fa8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.234693 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ccde08de-7978-48ba-a59b-34a979ab7fa8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.235710 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccde08de-7978-48ba-a59b-34a979ab7fa8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.236604 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89f16e7c-79df-4d83-85bf-e714bbd768fc-service-ca-bundle\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.236981 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.238864 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccde08de-7978-48ba-a59b-34a979ab7fa8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.238893 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f16e7c-79df-4d83-85bf-e714bbd768fc-metrics-certs\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.250540 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7ln\" (UniqueName: \"kubernetes.io/projected/eeba5175-b082-4db4-9db5-57be919585aa-kube-api-access-gt7ln\") pod \"openshift-controller-manager-operator-756b6f6bc6-hjc5f\" (UID: \"eeba5175-b082-4db4-9db5-57be919585aa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.265674 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.266252 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndqfb\" (UniqueName: \"kubernetes.io/projected/129989af-f3cc-44bf-9779-6688338d3130-kube-api-access-ndqfb\") pod \"downloads-7954f5f757-6n78q\" (UID: \"129989af-f3cc-44bf-9779-6688338d3130\") " 
pod="openshift-console/downloads-7954f5f757-6n78q" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.284710 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.290972 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.304900 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.307587 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6n78q" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.326463 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.335310 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.365155 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvtgq\" (UniqueName: \"kubernetes.io/projected/5d4532b3-caee-400d-992c-023a97f6d0ca-kube-api-access-jvtgq\") pod \"oauth-openshift-558db77b4-9tqb6\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.387104 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk7xx\" (UniqueName: \"kubernetes.io/projected/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-kube-api-access-tk7xx\") pod \"console-f9d7485db-6qqbh\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.400290 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.408746 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfbts\" (UniqueName: \"kubernetes.io/projected/35491756-4b45-4afa-a300-e8e55fa4b0c4-kube-api-access-gfbts\") pod \"dns-operator-744455d44c-g46bq\" (UID: \"35491756-4b45-4afa-a300-e8e55fa4b0c4\") " pod="openshift-dns-operator/dns-operator-744455d44c-g46bq" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.421707 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b634ffd-eba0-480b-8e1a-7e2f26582d07-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n4tsp\" (UID: \"6b634ffd-eba0-480b-8e1a-7e2f26582d07\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.446265 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znfxb\" (UniqueName: \"kubernetes.io/projected/1cb59275-0ea1-401b-b631-cf092f94b742-kube-api-access-znfxb\") pod \"openshift-config-operator-7777fb866f-vdwwz\" (UID: \"1cb59275-0ea1-401b-b631-cf092f94b742\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.461308 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vdp59"] Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.463984 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jb5b\" (UniqueName: \"kubernetes.io/projected/71911446-2b7a-45cb-8dc5-277b629ee152-kube-api-access-9jb5b\") pod \"machine-approver-56656f9798-zwwmm\" (UID: \"71911446-2b7a-45cb-8dc5-277b629ee152\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:53 crc kubenswrapper[4880]: W0218 11:53:53.482192 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f8c1f5b_ef29_4f12_88af_7387efdd41e6.slice/crio-48b7420a4a649aa03aa63a46c694d7938064033a5b225735af2e8e242b9c5cea WatchSource:0}: Error finding container 48b7420a4a649aa03aa63a46c694d7938064033a5b225735af2e8e242b9c5cea: Status 404 returned error can't find the container with id 48b7420a4a649aa03aa63a46c694d7938064033a5b225735af2e8e242b9c5cea Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.483223 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwnzb\" (UniqueName: \"kubernetes.io/projected/d05e33ff-25c2-4f7f-809c-7583945b4e7c-kube-api-access-vwnzb\") pod \"apiserver-76f77b778f-wb62r\" (UID: \"d05e33ff-25c2-4f7f-809c-7583945b4e7c\") " pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.485125 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.494174 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.501641 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flsmp\" (UniqueName: \"kubernetes.io/projected/fbca9013-a5ce-4d98-8552-503f5b2d8f45-kube-api-access-flsmp\") pod \"route-controller-manager-6576b87f9c-whp49\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.503872 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g46bq" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.512586 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.516697 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.522312 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.522908 4880 request.go:700] Waited for 1.809529568s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/serviceaccounts/service-ca/token Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.525967 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqdj\" (UniqueName: \"kubernetes.io/projected/89f16e7c-79df-4d83-85bf-e714bbd768fc-kube-api-access-qcqdj\") pod \"router-default-5444994796-t9dmw\" (UID: \"89f16e7c-79df-4d83-85bf-e714bbd768fc\") " pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.532362 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.546169 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj247\" (UniqueName: \"kubernetes.io/projected/9b90b292-7e22-4eb8-872f-d5552b334604-kube-api-access-mj247\") pod \"service-ca-9c57cc56f-45gzd\" (UID: \"9b90b292-7e22-4eb8-872f-d5552b334604\") " pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.565842 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t96gz\" (UniqueName: \"kubernetes.io/projected/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-kube-api-access-t96gz\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.566724 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.578113 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.585057 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5rcj7"] Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.590435 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8p7r\" (UniqueName: \"kubernetes.io/projected/fa1dfb09-2907-4e18-a008-e74e9fc62482-kube-api-access-m8p7r\") pod \"console-operator-58897d9998-sqwt7\" (UID: \"fa1dfb09-2907-4e18-a008-e74e9fc62482\") " pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.603897 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcjf7\" (UniqueName: \"kubernetes.io/projected/ccde08de-7978-48ba-a59b-34a979ab7fa8-kube-api-access-mcjf7\") pod \"kube-storage-version-migrator-operator-b67b599dd-xhznq\" (UID: \"ccde08de-7978-48ba-a59b-34a979ab7fa8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.620259 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.623370 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.629983 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.630666 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8561646-dee0-44aa-a718-fdeaaf0ff34b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ss7t9\" (UID: \"f8561646-dee0-44aa-a718-fdeaaf0ff34b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.651847 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpw42\" (UniqueName: \"kubernetes.io/projected/502f8aab-255f-468a-a21d-3c1fe4aaf8b2-kube-api-access-qpw42\") pod \"service-ca-operator-777779d784-dfz9j\" (UID: \"502f8aab-255f-468a-a21d-3c1fe4aaf8b2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.656539 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.668944 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g48xw\" (UniqueName: \"kubernetes.io/projected/9f587184-defe-4d9a-83a8-7cd95f215a55-kube-api-access-g48xw\") pod \"openshift-apiserver-operator-796bbdcf4f-lrf5s\" (UID: \"9f587184-defe-4d9a-83a8-7cd95f215a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.674357 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.689901 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bc4b180-6c06-4d90-8eb7-b096bc72b76a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jqn5r\" (UID: \"7bc4b180-6c06-4d90-8eb7-b096bc72b76a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.690248 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.699366 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4"] Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.733790 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.734039 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hpdw\" (UniqueName: \"kubernetes.io/projected/594db65b-f9dd-4b9b-a572-e52e3d5a5875-kube-api-access-2hpdw\") pod \"multus-admission-controller-857f4d67dd-429s2\" (UID: \"594db65b-f9dd-4b9b-a572-e52e3d5a5875\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.745331 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.756016 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr2t9\" (UniqueName: \"kubernetes.io/projected/3f0bd703-4d46-4dfd-942d-3b554f736833-kube-api-access-cr2t9\") pod 
\"olm-operator-6b444d44fb-6l7kh\" (UID: \"3f0bd703-4d46-4dfd-942d-3b554f736833\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.766367 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n"] Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.769856 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.780965 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6n78q"] Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.784811 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.807134 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.809782 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz"] Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.815733 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" event={"ID":"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d","Type":"ContainerStarted","Data":"df1b011190e540fee26adb5e00380446d4ebb1bd4d165a97d12489573c742907"} Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.818036 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" event={"ID":"0329fe9d-0f10-43a2-a74b-657ffe52fcf3","Type":"ContainerStarted","Data":"41aefa45898c6e4d8e4f663b44bb2ce2b45cce3a7f64d1f2d75d584bb4a17f96"} Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.820096 4880 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" event={"ID":"7f8c1f5b-ef29-4f12-88af-7387efdd41e6","Type":"ContainerStarted","Data":"48b7420a4a649aa03aa63a46c694d7938064033a5b225735af2e8e242b9c5cea"} Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.824537 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.837529 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.841974 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.845491 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.854145 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.866365 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.868863 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.885152 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.904474 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.935815 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.936295 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.959176 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 18 11:53:53 crc kubenswrapper[4880]: I0218 11:53:53.978381 4880 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.058564 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e74b6ff4-3a1b-4340-9076-992e4a049e8e-proxy-tls\") pod \"machine-config-controller-84d6567774-9ql7r\" (UID: \"e74b6ff4-3a1b-4340-9076-992e4a049e8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.084958 4880 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tkr2r\" (UID: \"0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.085148 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c284d25-2b25-48b6-a3b7-d4178e4db154-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.085258 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-etcd-client\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.085316 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-bound-sa-token\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.085343 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/590f26fc-f2c5-47c7-96c1-217f8fb36f67-secret-volume\") pod 
\"collect-profiles-29523585-rvbgw\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.085408 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nvd4\" (UniqueName: \"kubernetes.io/projected/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-kube-api-access-8nvd4\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.085436 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c284d25-2b25-48b6-a3b7-d4178e4db154-metrics-tls\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.085459 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c284d25-2b25-48b6-a3b7-d4178e4db154-trusted-ca\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.085580 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-tls\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.085974 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e74b6ff4-3a1b-4340-9076-992e4a049e8e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9ql7r\" (UID: \"e74b6ff4-3a1b-4340-9076-992e4a049e8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.085996 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-trusted-ca\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086030 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/590f26fc-f2c5-47c7-96c1-217f8fb36f67-config-volume\") pod \"collect-profiles-29523585-rvbgw\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086069 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-serving-cert\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086084 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-etcd-ca\") pod \"etcd-operator-b45778765-pcx4q\" (UID: 
\"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086103 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk2k8\" (UniqueName: \"kubernetes.io/projected/e74b6ff4-3a1b-4340-9076-992e4a049e8e-kube-api-access-xk2k8\") pod \"machine-config-controller-84d6567774-9ql7r\" (UID: \"e74b6ff4-3a1b-4340-9076-992e4a049e8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086136 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a7f1515f-71f7-47dc-b429-1ce17401b9ea-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086152 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-config\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086173 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a7f1515f-71f7-47dc-b429-1ce17401b9ea-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086249 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-etcd-service-ca\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086324 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xzmw2\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086428 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.086463 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzrxh\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-kube-api-access-vzrxh\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.086918 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 11:53:54.586903949 +0000 UTC m=+142.015804810 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.087167 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psf94\" (UniqueName: \"kubernetes.io/projected/c34a3987-0ad6-4165-a6cf-717d47fea5fb-kube-api-access-psf94\") pod \"marketplace-operator-79b997595-xzmw2\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.087468 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-certificates\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.087503 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd98v\" (UniqueName: \"kubernetes.io/projected/0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a-kube-api-access-cd98v\") pod \"package-server-manager-789f6589d5-tkr2r\" (UID: \"0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.087535 
4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xzmw2\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.087558 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlsmr\" (UniqueName: \"kubernetes.io/projected/178d28de-f513-4c1f-bd35-886b84a1b892-kube-api-access-hlsmr\") pod \"control-plane-machine-set-operator-78cbb6b69f-cbjbp\" (UID: \"178d28de-f513-4c1f-bd35-886b84a1b892\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.087632 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmc2z\" (UniqueName: \"kubernetes.io/projected/7c284d25-2b25-48b6-a3b7-d4178e4db154-kube-api-access-dmc2z\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.087660 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/178d28de-f513-4c1f-bd35-886b84a1b892-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cbjbp\" (UID: \"178d28de-f513-4c1f-bd35-886b84a1b892\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.087701 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cr66w\" (UniqueName: \"kubernetes.io/projected/590f26fc-f2c5-47c7-96c1-217f8fb36f67-kube-api-access-cr66w\") pod \"collect-profiles-29523585-rvbgw\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.188398 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.188648 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:54.688583291 +0000 UTC m=+142.117484152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.188761 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c284d25-2b25-48b6-a3b7-d4178e4db154-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.188796 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-etcd-client\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.188835 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-bound-sa-token\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.188852 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/590f26fc-f2c5-47c7-96c1-217f8fb36f67-secret-volume\") pod \"collect-profiles-29523585-rvbgw\" (UID: 
\"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.188879 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-csi-data-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.188902 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/644e6f4e-dcff-4163-92f5-87d0d42bf9e0-srv-cert\") pod \"catalog-operator-68c6474976-llwhx\" (UID: \"644e6f4e-dcff-4163-92f5-87d0d42bf9e0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.188926 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d45b4ded-00aa-4fcd-b166-59f997447d20-proxy-tls\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.188973 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f95zm\" (UniqueName: \"kubernetes.io/projected/644e6f4e-dcff-4163-92f5-87d0d42bf9e0-kube-api-access-f95zm\") pod \"catalog-operator-68c6474976-llwhx\" (UID: \"644e6f4e-dcff-4163-92f5-87d0d42bf9e0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189019 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-8nvd4\" (UniqueName: \"kubernetes.io/projected/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-kube-api-access-8nvd4\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189055 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c284d25-2b25-48b6-a3b7-d4178e4db154-metrics-tls\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189075 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c284d25-2b25-48b6-a3b7-d4178e4db154-trusted-ca\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189097 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-tls\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189116 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a20a14e8-19c7-4712-989c-cdf0b422b882-metrics-tls\") pod \"dns-default-4pt2m\" (UID: \"a20a14e8-19c7-4712-989c-cdf0b422b882\") " pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189270 4880 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e74b6ff4-3a1b-4340-9076-992e4a049e8e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9ql7r\" (UID: \"e74b6ff4-3a1b-4340-9076-992e4a049e8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189293 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd9lx\" (UniqueName: \"kubernetes.io/projected/d45b4ded-00aa-4fcd-b166-59f997447d20-kube-api-access-hd9lx\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189313 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-trusted-ca\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189368 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/590f26fc-f2c5-47c7-96c1-217f8fb36f67-config-volume\") pod \"collect-profiles-29523585-rvbgw\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189442 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-mountpoint-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: 
\"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189462 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d45b4ded-00aa-4fcd-b166-59f997447d20-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189487 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-serving-cert\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189504 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-etcd-ca\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189536 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk2k8\" (UniqueName: \"kubernetes.io/projected/e74b6ff4-3a1b-4340-9076-992e4a049e8e-kube-api-access-xk2k8\") pod \"machine-config-controller-84d6567774-9ql7r\" (UID: \"e74b6ff4-3a1b-4340-9076-992e4a049e8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189584 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/a7f1515f-71f7-47dc-b429-1ce17401b9ea-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189622 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-config\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189661 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-socket-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189677 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-plugins-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189699 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a7f1515f-71f7-47dc-b429-1ce17401b9ea-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189735 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcb7870f-01f8-423b-ac4c-5e2cae04ca25-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4nq8f\" (UID: \"bcb7870f-01f8-423b-ac4c-5e2cae04ca25\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189771 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-etcd-service-ca\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189864 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189886 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xzmw2\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189903 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f54901d5-c293-48d0-bf46-0d14f035214b-certs\") pod \"machine-config-server-j7s2s\" (UID: 
\"f54901d5-c293-48d0-bf46-0d14f035214b\") " pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189930 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-registration-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.189988 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzrxh\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-kube-api-access-vzrxh\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190033 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqp8r\" (UniqueName: \"kubernetes.io/projected/4cab1e2b-abe4-4e53-ada6-d77879bcde5e-kube-api-access-gqp8r\") pod \"ingress-canary-stc5w\" (UID: \"4cab1e2b-abe4-4e53-ada6-d77879bcde5e\") " pod="openshift-ingress-canary/ingress-canary-stc5w" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190075 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f54901d5-c293-48d0-bf46-0d14f035214b-node-bootstrap-token\") pod \"machine-config-server-j7s2s\" (UID: \"f54901d5-c293-48d0-bf46-0d14f035214b\") " pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190110 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hbg6\" 
(UniqueName: \"kubernetes.io/projected/f54901d5-c293-48d0-bf46-0d14f035214b-kube-api-access-4hbg6\") pod \"machine-config-server-j7s2s\" (UID: \"f54901d5-c293-48d0-bf46-0d14f035214b\") " pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190144 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psf94\" (UniqueName: \"kubernetes.io/projected/c34a3987-0ad6-4165-a6cf-717d47fea5fb-kube-api-access-psf94\") pod \"marketplace-operator-79b997595-xzmw2\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190185 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52gpl\" (UniqueName: \"kubernetes.io/projected/a20a14e8-19c7-4712-989c-cdf0b422b882-kube-api-access-52gpl\") pod \"dns-default-4pt2m\" (UID: \"a20a14e8-19c7-4712-989c-cdf0b422b882\") " pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190219 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-certificates\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190255 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd98v\" (UniqueName: \"kubernetes.io/projected/0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a-kube-api-access-cd98v\") pod \"package-server-manager-789f6589d5-tkr2r\" (UID: \"0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" Feb 18 11:53:54 crc 
kubenswrapper[4880]: I0218 11:53:54.190274 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb7870f-01f8-423b-ac4c-5e2cae04ca25-config\") pod \"kube-apiserver-operator-766d6c64bb-4nq8f\" (UID: \"bcb7870f-01f8-423b-ac4c-5e2cae04ca25\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190294 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/644e6f4e-dcff-4163-92f5-87d0d42bf9e0-profile-collector-cert\") pod \"catalog-operator-68c6474976-llwhx\" (UID: \"644e6f4e-dcff-4163-92f5-87d0d42bf9e0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190326 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xzmw2\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190347 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlsmr\" (UniqueName: \"kubernetes.io/projected/178d28de-f513-4c1f-bd35-886b84a1b892-kube-api-access-hlsmr\") pod \"control-plane-machine-set-operator-78cbb6b69f-cbjbp\" (UID: \"178d28de-f513-4c1f-bd35-886b84a1b892\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190374 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a20a14e8-19c7-4712-989c-cdf0b422b882-config-volume\") pod \"dns-default-4pt2m\" (UID: \"a20a14e8-19c7-4712-989c-cdf0b422b882\") " pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190392 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d45b4ded-00aa-4fcd-b166-59f997447d20-images\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190448 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmc2z\" (UniqueName: \"kubernetes.io/projected/7c284d25-2b25-48b6-a3b7-d4178e4db154-kube-api-access-dmc2z\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190466 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/178d28de-f513-4c1f-bd35-886b84a1b892-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cbjbp\" (UID: \"178d28de-f513-4c1f-bd35-886b84a1b892\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190486 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr66w\" (UniqueName: \"kubernetes.io/projected/590f26fc-f2c5-47c7-96c1-217f8fb36f67-kube-api-access-cr66w\") pod \"collect-profiles-29523585-rvbgw\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 
11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190519 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcb7870f-01f8-423b-ac4c-5e2cae04ca25-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4nq8f\" (UID: \"bcb7870f-01f8-423b-ac4c-5e2cae04ca25\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190538 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xwhn\" (UniqueName: \"kubernetes.io/projected/997abc8f-4776-4edd-9247-5bd0d51823f2-kube-api-access-6xwhn\") pod \"migrator-59844c95c7-b4gvz\" (UID: \"997abc8f-4776-4edd-9247-5bd0d51823f2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190599 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e74b6ff4-3a1b-4340-9076-992e4a049e8e-proxy-tls\") pod \"machine-config-controller-84d6567774-9ql7r\" (UID: \"e74b6ff4-3a1b-4340-9076-992e4a049e8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190694 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tkr2r\" (UID: \"0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.190727 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtf5n\" (UniqueName: 
\"kubernetes.io/projected/2b7f342a-8424-4d37-96d3-de3851c1de4f-kube-api-access-xtf5n\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.191154 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-trusted-ca\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.194178 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c284d25-2b25-48b6-a3b7-d4178e4db154-trusted-ca\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.194394 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a7f1515f-71f7-47dc-b429-1ce17401b9ea-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.194450 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e74b6ff4-3a1b-4340-9076-992e4a049e8e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9ql7r\" (UID: \"e74b6ff4-3a1b-4340-9076-992e4a049e8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.194725 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cab1e2b-abe4-4e53-ada6-d77879bcde5e-cert\") pod \"ingress-canary-stc5w\" (UID: \"4cab1e2b-abe4-4e53-ada6-d77879bcde5e\") " pod="openshift-ingress-canary/ingress-canary-stc5w" Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.195485 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:54.695462895 +0000 UTC m=+142.124363756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.195507 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-etcd-service-ca\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.195710 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-etcd-ca\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.198044 4880 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c284d25-2b25-48b6-a3b7-d4178e4db154-metrics-tls\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.199257 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-config\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.199449 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tkr2r\" (UID: \"0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.202439 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-certificates\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.206325 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-tls\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc 
kubenswrapper[4880]: I0218 11:53:54.207074 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/178d28de-f513-4c1f-bd35-886b84a1b892-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-cbjbp\" (UID: \"178d28de-f513-4c1f-bd35-886b84a1b892\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.208292 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-etcd-client\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.210224 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xzmw2\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.212204 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-serving-cert\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.213373 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a7f1515f-71f7-47dc-b429-1ce17401b9ea-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bj57p\" (UID: 
\"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.214412 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e74b6ff4-3a1b-4340-9076-992e4a049e8e-proxy-tls\") pod \"machine-config-controller-84d6567774-9ql7r\" (UID: \"e74b6ff4-3a1b-4340-9076-992e4a049e8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.215010 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xzmw2\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.225802 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nvd4\" (UniqueName: \"kubernetes.io/projected/12162d78-4cd9-4850-b9b6-d9a5bd9bef0d-kube-api-access-8nvd4\") pod \"etcd-operator-b45778765-pcx4q\" (UID: \"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.258154 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c284d25-2b25-48b6-a3b7-d4178e4db154-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: \"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.267021 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-bound-sa-token\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.269941 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.271277 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9tqb6"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.276479 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/590f26fc-f2c5-47c7-96c1-217f8fb36f67-secret-volume\") pod \"collect-profiles-29523585-rvbgw\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.285722 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.287258 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g46bq"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.300755 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.300991 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/f54901d5-c293-48d0-bf46-0d14f035214b-certs\") pod \"machine-config-server-j7s2s\" (UID: \"f54901d5-c293-48d0-bf46-0d14f035214b\") " pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301020 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-registration-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301060 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqp8r\" (UniqueName: \"kubernetes.io/projected/4cab1e2b-abe4-4e53-ada6-d77879bcde5e-kube-api-access-gqp8r\") pod \"ingress-canary-stc5w\" (UID: \"4cab1e2b-abe4-4e53-ada6-d77879bcde5e\") " pod="openshift-ingress-canary/ingress-canary-stc5w" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301079 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f54901d5-c293-48d0-bf46-0d14f035214b-node-bootstrap-token\") pod \"machine-config-server-j7s2s\" (UID: \"f54901d5-c293-48d0-bf46-0d14f035214b\") " pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301099 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hbg6\" (UniqueName: \"kubernetes.io/projected/f54901d5-c293-48d0-bf46-0d14f035214b-kube-api-access-4hbg6\") pod \"machine-config-server-j7s2s\" (UID: \"f54901d5-c293-48d0-bf46-0d14f035214b\") " pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301129 4880 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-52gpl\" (UniqueName: \"kubernetes.io/projected/a20a14e8-19c7-4712-989c-cdf0b422b882-kube-api-access-52gpl\") pod \"dns-default-4pt2m\" (UID: \"a20a14e8-19c7-4712-989c-cdf0b422b882\") " pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301158 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb7870f-01f8-423b-ac4c-5e2cae04ca25-config\") pod \"kube-apiserver-operator-766d6c64bb-4nq8f\" (UID: \"bcb7870f-01f8-423b-ac4c-5e2cae04ca25\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301173 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/644e6f4e-dcff-4163-92f5-87d0d42bf9e0-profile-collector-cert\") pod \"catalog-operator-68c6474976-llwhx\" (UID: \"644e6f4e-dcff-4163-92f5-87d0d42bf9e0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301200 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a20a14e8-19c7-4712-989c-cdf0b422b882-config-volume\") pod \"dns-default-4pt2m\" (UID: \"a20a14e8-19c7-4712-989c-cdf0b422b882\") " pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301217 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d45b4ded-00aa-4fcd-b166-59f997447d20-images\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301245 4880 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6xwhn\" (UniqueName: \"kubernetes.io/projected/997abc8f-4776-4edd-9247-5bd0d51823f2-kube-api-access-6xwhn\") pod \"migrator-59844c95c7-b4gvz\" (UID: \"997abc8f-4776-4edd-9247-5bd0d51823f2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301268 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcb7870f-01f8-423b-ac4c-5e2cae04ca25-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4nq8f\" (UID: \"bcb7870f-01f8-423b-ac4c-5e2cae04ca25\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301314 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cab1e2b-abe4-4e53-ada6-d77879bcde5e-cert\") pod \"ingress-canary-stc5w\" (UID: \"4cab1e2b-abe4-4e53-ada6-d77879bcde5e\") " pod="openshift-ingress-canary/ingress-canary-stc5w" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301338 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtf5n\" (UniqueName: \"kubernetes.io/projected/2b7f342a-8424-4d37-96d3-de3851c1de4f-kube-api-access-xtf5n\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301374 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-csi-data-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301397 4880 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d45b4ded-00aa-4fcd-b166-59f997447d20-proxy-tls\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301416 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/644e6f4e-dcff-4163-92f5-87d0d42bf9e0-srv-cert\") pod \"catalog-operator-68c6474976-llwhx\" (UID: \"644e6f4e-dcff-4163-92f5-87d0d42bf9e0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301434 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f95zm\" (UniqueName: \"kubernetes.io/projected/644e6f4e-dcff-4163-92f5-87d0d42bf9e0-kube-api-access-f95zm\") pod \"catalog-operator-68c6474976-llwhx\" (UID: \"644e6f4e-dcff-4163-92f5-87d0d42bf9e0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301457 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a20a14e8-19c7-4712-989c-cdf0b422b882-metrics-tls\") pod \"dns-default-4pt2m\" (UID: \"a20a14e8-19c7-4712-989c-cdf0b422b882\") " pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301484 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd9lx\" (UniqueName: \"kubernetes.io/projected/d45b4ded-00aa-4fcd-b166-59f997447d20-kube-api-access-hd9lx\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301523 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-mountpoint-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301546 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d45b4ded-00aa-4fcd-b166-59f997447d20-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301577 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-socket-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301593 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-plugins-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.301633 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcb7870f-01f8-423b-ac4c-5e2cae04ca25-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4nq8f\" (UID: 
\"bcb7870f-01f8-423b-ac4c-5e2cae04ca25\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.301910 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:54.801885952 +0000 UTC m=+142.230786813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: W0218 11:53:54.302556 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeba5175_b082_4db4_9db5_57be919585aa.slice/crio-7c7399650c0cd905d5c47ac1197ab5ff9547021ac29f8040b5df0478dc09cd06 WatchSource:0}: Error finding container 7c7399650c0cd905d5c47ac1197ab5ff9547021ac29f8040b5df0478dc09cd06: Status 404 returned error can't find the container with id 7c7399650c0cd905d5c47ac1197ab5ff9547021ac29f8040b5df0478dc09cd06 Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.306972 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f54901d5-c293-48d0-bf46-0d14f035214b-certs\") pod \"machine-config-server-j7s2s\" (UID: \"f54901d5-c293-48d0-bf46-0d14f035214b\") " pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.307117 4880 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-mountpoint-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.307684 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d45b4ded-00aa-4fcd-b166-59f997447d20-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.309067 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-socket-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.309117 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-plugins-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.309280 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlsmr\" (UniqueName: \"kubernetes.io/projected/178d28de-f513-4c1f-bd35-886b84a1b892-kube-api-access-hlsmr\") pod \"control-plane-machine-set-operator-78cbb6b69f-cbjbp\" (UID: \"178d28de-f513-4c1f-bd35-886b84a1b892\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.309522 4880 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-registration-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.309963 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f54901d5-c293-48d0-bf46-0d14f035214b-node-bootstrap-token\") pod \"machine-config-server-j7s2s\" (UID: \"f54901d5-c293-48d0-bf46-0d14f035214b\") " pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.310072 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d45b4ded-00aa-4fcd-b166-59f997447d20-images\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.312161 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cab1e2b-abe4-4e53-ada6-d77879bcde5e-cert\") pod \"ingress-canary-stc5w\" (UID: \"4cab1e2b-abe4-4e53-ada6-d77879bcde5e\") " pod="openshift-ingress-canary/ingress-canary-stc5w" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.313278 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a20a14e8-19c7-4712-989c-cdf0b422b882-config-volume\") pod \"dns-default-4pt2m\" (UID: \"a20a14e8-19c7-4712-989c-cdf0b422b882\") " pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.313441 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2b7f342a-8424-4d37-96d3-de3851c1de4f-csi-data-dir\") pod \"csi-hostpathplugin-8wwmq\" (UID: \"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.314106 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.314975 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/644e6f4e-dcff-4163-92f5-87d0d42bf9e0-profile-collector-cert\") pod \"catalog-operator-68c6474976-llwhx\" (UID: \"644e6f4e-dcff-4163-92f5-87d0d42bf9e0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.316089 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a20a14e8-19c7-4712-989c-cdf0b422b882-metrics-tls\") pod \"dns-default-4pt2m\" (UID: \"a20a14e8-19c7-4712-989c-cdf0b422b882\") " pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.319791 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/644e6f4e-dcff-4163-92f5-87d0d42bf9e0-srv-cert\") pod \"catalog-operator-68c6474976-llwhx\" (UID: \"644e6f4e-dcff-4163-92f5-87d0d42bf9e0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.322045 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcb7870f-01f8-423b-ac4c-5e2cae04ca25-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4nq8f\" (UID: \"bcb7870f-01f8-423b-ac4c-5e2cae04ca25\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.322562 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d45b4ded-00aa-4fcd-b166-59f997447d20-proxy-tls\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.322926 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzrxh\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-kube-api-access-vzrxh\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.332724 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb7870f-01f8-423b-ac4c-5e2cae04ca25-config\") pod \"kube-apiserver-operator-766d6c64bb-4nq8f\" (UID: \"bcb7870f-01f8-423b-ac4c-5e2cae04ca25\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.334360 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/590f26fc-f2c5-47c7-96c1-217f8fb36f67-config-volume\") pod \"collect-profiles-29523585-rvbgw\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:53:54 crc kubenswrapper[4880]: W0218 11:53:54.340458 4880 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d4532b3_caee_400d_992c_023a97f6d0ca.slice/crio-7e6704454cec8fadb22b128e07d5d9b74d2e54888a8274a3ae30f1fae24dd02b WatchSource:0}: Error finding container 7e6704454cec8fadb22b128e07d5d9b74d2e54888a8274a3ae30f1fae24dd02b: Status 404 returned error can't find the container with id 7e6704454cec8fadb22b128e07d5d9b74d2e54888a8274a3ae30f1fae24dd02b Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.349988 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk2k8\" (UniqueName: \"kubernetes.io/projected/e74b6ff4-3a1b-4340-9076-992e4a049e8e-kube-api-access-xk2k8\") pod \"machine-config-controller-84d6567774-9ql7r\" (UID: \"e74b6ff4-3a1b-4340-9076-992e4a049e8e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.371301 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psf94\" (UniqueName: \"kubernetes.io/projected/c34a3987-0ad6-4165-a6cf-717d47fea5fb-kube-api-access-psf94\") pod \"marketplace-operator-79b997595-xzmw2\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.387870 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr66w\" (UniqueName: \"kubernetes.io/projected/590f26fc-f2c5-47c7-96c1-217f8fb36f67-kube-api-access-cr66w\") pod \"collect-profiles-29523585-rvbgw\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.402387 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.403322 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:54.903310747 +0000 UTC m=+142.332211608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.414074 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6qqbh"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.421458 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd98v\" (UniqueName: \"kubernetes.io/projected/0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a-kube-api-access-cd98v\") pod \"package-server-manager-789f6589d5-tkr2r\" (UID: \"0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.429637 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmc2z\" (UniqueName: \"kubernetes.io/projected/7c284d25-2b25-48b6-a3b7-d4178e4db154-kube-api-access-dmc2z\") pod \"ingress-operator-5b745b69d9-cdwn2\" (UID: 
\"7c284d25-2b25-48b6-a3b7-d4178e4db154\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.481338 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.483314 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcb7870f-01f8-423b-ac4c-5e2cae04ca25-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4nq8f\" (UID: \"bcb7870f-01f8-423b-ac4c-5e2cae04ca25\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.485249 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.496162 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd9lx\" (UniqueName: \"kubernetes.io/projected/d45b4ded-00aa-4fcd-b166-59f997447d20-kube-api-access-hd9lx\") pod \"machine-config-operator-74547568cd-xgvx6\" (UID: \"d45b4ded-00aa-4fcd-b166-59f997447d20\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.496714 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.507929 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.508369 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.008337224 +0000 UTC m=+142.437238085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.508825 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.509437 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.517416 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52gpl\" (UniqueName: \"kubernetes.io/projected/a20a14e8-19c7-4712-989c-cdf0b422b882-kube-api-access-52gpl\") pod \"dns-default-4pt2m\" (UID: \"a20a14e8-19c7-4712-989c-cdf0b422b882\") " pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.517806 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.017792482 +0000 UTC m=+142.446693343 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.524886 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxz72"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.536580 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.544742 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtf5n\" (UniqueName: \"kubernetes.io/projected/2b7f342a-8424-4d37-96d3-de3851c1de4f-kube-api-access-xtf5n\") pod \"csi-hostpathplugin-8wwmq\" (UID: 
\"2b7f342a-8424-4d37-96d3-de3851c1de4f\") " pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.552784 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-45gzd"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.554327 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.560784 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.565881 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.584412 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sqwt7"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.584471 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wb62r"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.584487 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.587806 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.590754 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.593784 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqp8r\" (UniqueName: \"kubernetes.io/projected/4cab1e2b-abe4-4e53-ada6-d77879bcde5e-kube-api-access-gqp8r\") pod \"ingress-canary-stc5w\" (UID: \"4cab1e2b-abe4-4e53-ada6-d77879bcde5e\") " pod="openshift-ingress-canary/ingress-canary-stc5w" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.596794 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.601791 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xwhn\" (UniqueName: \"kubernetes.io/projected/997abc8f-4776-4edd-9247-5bd0d51823f2-kube-api-access-6xwhn\") pod \"migrator-59844c95c7-b4gvz\" (UID: \"997abc8f-4776-4edd-9247-5bd0d51823f2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.601939 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.602018 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hbg6\" (UniqueName: \"kubernetes.io/projected/f54901d5-c293-48d0-bf46-0d14f035214b-kube-api-access-4hbg6\") pod \"machine-config-server-j7s2s\" (UID: \"f54901d5-c293-48d0-bf46-0d14f035214b\") " pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.608764 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.610787 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.611062 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.111029155 +0000 UTC m=+142.539930016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.611243 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.611803 4880 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.111780457 +0000 UTC m=+142.540681318 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.622925 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f95zm\" (UniqueName: \"kubernetes.io/projected/644e6f4e-dcff-4163-92f5-87d0d42bf9e0-kube-api-access-f95zm\") pod \"catalog-operator-68c6474976-llwhx\" (UID: \"644e6f4e-dcff-4163-92f5-87d0d42bf9e0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.631286 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.634024 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.639381 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4pt2m" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.644650 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-429s2"] Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.649876 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-stc5w" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.665546 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j"] Feb 18 11:53:54 crc kubenswrapper[4880]: W0218 11:53:54.669199 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccde08de_7978_48ba_a59b_34a979ab7fa8.slice/crio-39fb71f669b15bdba854d98cc1113b2ae813d1de0e69ef21c1acf3f3d1f5cf97 WatchSource:0}: Error finding container 39fb71f669b15bdba854d98cc1113b2ae813d1de0e69ef21c1acf3f3d1f5cf97: Status 404 returned error can't find the container with id 39fb71f669b15bdba854d98cc1113b2ae813d1de0e69ef21c1acf3f3d1f5cf97 Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.677245 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-j7s2s" Feb 18 11:53:54 crc kubenswrapper[4880]: W0218 11:53:54.698711 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd05e33ff_25c2_4f7f_809c_7583945b4e7c.slice/crio-2deacb5b653a2f1640258f1d9c05b7a647cd63a5cbb18166544bb6c0554cfa34 WatchSource:0}: Error finding container 2deacb5b653a2f1640258f1d9c05b7a647cd63a5cbb18166544bb6c0554cfa34: Status 404 returned error can't find the container with id 2deacb5b653a2f1640258f1d9c05b7a647cd63a5cbb18166544bb6c0554cfa34 Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.698841 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" Feb 18 11:53:54 crc kubenswrapper[4880]: W0218 11:53:54.701184 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f0bd703_4d46_4dfd_942d_3b554f736833.slice/crio-a6cd6a963f320848923b04bfe41630ed68613f931ccfaadc838443ef0d3444fb WatchSource:0}: Error finding container a6cd6a963f320848923b04bfe41630ed68613f931ccfaadc838443ef0d3444fb: Status 404 returned error can't find the container with id a6cd6a963f320848923b04bfe41630ed68613f931ccfaadc838443ef0d3444fb Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.712810 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.713347 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.213330305 +0000 UTC m=+142.642231166 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.713388 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.713752 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.213744977 +0000 UTC m=+142.642645838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.814355 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.814554 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.314521493 +0000 UTC m=+142.743422364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.814642 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.814958 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.314947406 +0000 UTC m=+142.743848267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.833810 4880 generic.go:334] "Generic (PLEG): container finished" podID="1cb59275-0ea1-401b-b631-cf092f94b742" containerID="2b4832c098a6d80fb1f7c0ad28f73e9ccca1d108cd79e2ecea6c612de086d345" exitCode=0 Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.833882 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" event={"ID":"1cb59275-0ea1-401b-b631-cf092f94b742","Type":"ContainerDied","Data":"2b4832c098a6d80fb1f7c0ad28f73e9ccca1d108cd79e2ecea6c612de086d345"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.833908 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" event={"ID":"1cb59275-0ea1-401b-b631-cf092f94b742","Type":"ContainerStarted","Data":"69ced81ebc94c63e95ae2a7013cf74ab4c7ed4bf6a290606381145b21f458b6e"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.836120 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" event={"ID":"4153ada5-4d96-41b3-805c-ab5464fc34ba","Type":"ContainerStarted","Data":"1df8dc4a45409a9c8b6796ee2232f7437b8a0f5c3a432ecec3c69f8f2ff5fcfc"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.837410 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" 
event={"ID":"4153ada5-4d96-41b3-805c-ab5464fc34ba","Type":"ContainerStarted","Data":"28f5233d42b4c4e0173f7a0ffa393cea886392967a8d7ae7d79b87b18c3c61eb"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.838521 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.841023 4880 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-lcr7n container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:5443/healthz\": dial tcp 10.217.0.15:5443: connect: connection refused" start-of-body= Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.841066 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" podUID="4153ada5-4d96-41b3-805c-ab5464fc34ba" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.15:5443/healthz\": dial tcp 10.217.0.15:5443: connect: connection refused" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.841519 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t9dmw" event={"ID":"89f16e7c-79df-4d83-85bf-e714bbd768fc","Type":"ContainerStarted","Data":"88a3dc51c4501c204b7f5f542a95b39bae9684351f9abead5e8263308726a042"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.841549 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t9dmw" event={"ID":"89f16e7c-79df-4d83-85bf-e714bbd768fc","Type":"ContainerStarted","Data":"e8c945a3418f0fe708ebbc4528ccd266ae0f804abe4c63a192956d30a910283d"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.845033 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" 
event={"ID":"5d4532b3-caee-400d-992c-023a97f6d0ca","Type":"ContainerStarted","Data":"7e6704454cec8fadb22b128e07d5d9b74d2e54888a8274a3ae30f1fae24dd02b"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.853133 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g46bq" event={"ID":"35491756-4b45-4afa-a300-e8e55fa4b0c4","Type":"ContainerStarted","Data":"b48dfc23cf398289079e404b062421a18f5241d5e42539c56e34c40b514ab111"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.855696 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" event={"ID":"9b90b292-7e22-4eb8-872f-d5552b334604","Type":"ContainerStarted","Data":"3abe763500ed570d56400918b7383c487c4f82a3de56e828aef7261612df6120"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.859064 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6n78q" event={"ID":"129989af-f3cc-44bf-9779-6688338d3130","Type":"ContainerStarted","Data":"c51384ba3fd1325809de037e607e8f0a61c882e0bc397bc3a80b22ca6ab5ea08"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.859159 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6n78q" event={"ID":"129989af-f3cc-44bf-9779-6688338d3130","Type":"ContainerStarted","Data":"00a325c5e2a0589b982ddef52e071e9c0198e3d1b805f8cbb7ab4d880bbc4706"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.859199 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6n78q" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.862079 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" event={"ID":"71911446-2b7a-45cb-8dc5-277b629ee152","Type":"ContainerStarted","Data":"a3905d294f9fbf3d2dcacf9448d7e01d4ff5f7ea08689d087da3f47dd4a147c0"} Feb 18 11:53:54 crc 
kubenswrapper[4880]: I0218 11:53:54.865407 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp" event={"ID":"6b634ffd-eba0-480b-8e1a-7e2f26582d07","Type":"ContainerStarted","Data":"876b1ba005ccdb5169e831a7ac0d28a8938c1815bc9f80ad2709925a442a1825"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.865775 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.865819 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.870078 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" event={"ID":"594db65b-f9dd-4b9b-a572-e52e3d5a5875","Type":"ContainerStarted","Data":"bcb4bb3e27d2968e4a5367307f2add8cd92f28b6f91f87cbdcae468a04e17fb0"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.874028 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" event={"ID":"9f0337cb-0aed-41af-b587-a16392350413","Type":"ContainerStarted","Data":"09f3b432b886cc050f49fb7ad116e7a59fe9fd6c3477aad4ded888ac13f47d35"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.880493 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" 
event={"ID":"7fc0ff29-742b-466a-84bb-4a7fa8bdb86d","Type":"ContainerStarted","Data":"3c7dc2e621a5dc433d05c0167f4af81496ffa08e96526b3cc7118f1725bd92fe"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.883080 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" event={"ID":"f8561646-dee0-44aa-a718-fdeaaf0ff34b","Type":"ContainerStarted","Data":"085015c3d24f94553fb0109d37d998aa496cb2723c8b86425b64742fe6b0203b"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.887786 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" event={"ID":"502f8aab-255f-468a-a21d-3c1fe4aaf8b2","Type":"ContainerStarted","Data":"bafda2b6e96e90bef195bff2b4ca33361475c4d5a3fc1a9a638e3662e55a7991"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.895489 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" event={"ID":"7bc4b180-6c06-4d90-8eb7-b096bc72b76a","Type":"ContainerStarted","Data":"10916a8e63e81da9ac5b39c661adb4440f59e9a65f37a92797a67bd87f2dbaec"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.915771 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:54 crc kubenswrapper[4880]: E0218 11:53:54.916174 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 11:53:55.416129504 +0000 UTC m=+142.845030365 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.920771 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.934546 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" event={"ID":"ccde08de-7978-48ba-a59b-34a979ab7fa8","Type":"ContainerStarted","Data":"39fb71f669b15bdba854d98cc1113b2ae813d1de0e69ef21c1acf3f3d1f5cf97"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.936776 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" event={"ID":"7f8c1f5b-ef29-4f12-88af-7387efdd41e6","Type":"ContainerStarted","Data":"de5d1f6c55ce4a0ca22c375930435e06d045e03b5b8b9eb1f1988d585b75b5cc"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.945222 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" event={"ID":"3f0bd703-4d46-4dfd-942d-3b554f736833","Type":"ContainerStarted","Data":"a6cd6a963f320848923b04bfe41630ed68613f931ccfaadc838443ef0d3444fb"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.950869 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f" event={"ID":"eeba5175-b082-4db4-9db5-57be919585aa","Type":"ContainerStarted","Data":"7c7399650c0cd905d5c47ac1197ab5ff9547021ac29f8040b5df0478dc09cd06"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.951874 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" event={"ID":"9f587184-defe-4d9a-83a8-7cd95f215a55","Type":"ContainerStarted","Data":"1ab2b05128a3605db898b69cde5c619b527277f935f4c81c05ed3a08db7cf386"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.956372 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6qqbh" event={"ID":"0032e4f7-d08f-4fe5-890c-f02eb48a7f86","Type":"ContainerStarted","Data":"73fd2c212dc4ee03146390754fc50d55e22b0525879aa567f761282a40c83869"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.961602 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt" event={"ID":"c5edee35-4ff0-4da6-8db9-8cc26218d3f7","Type":"ContainerStarted","Data":"bc5e5e40bfdaf5cb4498848e71361f99f0dd4b35b83425eeb0337bae92ba8b80"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.970290 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" event={"ID":"fbca9013-a5ce-4d98-8552-503f5b2d8f45","Type":"ContainerStarted","Data":"1a2cfcc39ecc2621153ce25906d4dbc2480e0bbfaeb4460193173e1576002885"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 11:53:54.975759 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wb62r" event={"ID":"d05e33ff-25c2-4f7f-809c-7583945b4e7c","Type":"ContainerStarted","Data":"2deacb5b653a2f1640258f1d9c05b7a647cd63a5cbb18166544bb6c0554cfa34"} Feb 18 11:53:54 crc kubenswrapper[4880]: I0218 
11:53:54.979571 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sqwt7" event={"ID":"fa1dfb09-2907-4e18-a008-e74e9fc62482","Type":"ContainerStarted","Data":"d7d28f3a40597f42860a1aeeb763ed1ecf1aa77ad56854738046831ab003ec6a"} Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.021123 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.022702 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.522673314 +0000 UTC m=+142.951574175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.031833 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2"] Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.123665 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.123980 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.623928174 +0000 UTC m=+143.052829035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.124278 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.124748 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.624737967 +0000 UTC m=+143.053638828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.219650 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r"] Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.226262 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.226640 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.726574404 +0000 UTC m=+143.155475265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.227682 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.228234 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.728219661 +0000 UTC m=+143.157120522 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.329171 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.329949 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.829930794 +0000 UTC m=+143.258831655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.437452 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz"] Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.440306 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.440826 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:55.940802506 +0000 UTC m=+143.369703547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.542631 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.543233 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.043209889 +0000 UTC m=+143.472110750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.644936 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.645392 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.145377305 +0000 UTC m=+143.574278166 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.675637 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-t9dmw"
Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.747406 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.748083 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.248040215 +0000 UTC m=+143.676941076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.784826 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-6n78q" podStartSLOduration=123.784809244 podStartE2EDuration="2m3.784809244s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:55.783005753 +0000 UTC m=+143.211906624" watchObservedRunningTime="2026-02-18 11:53:55.784809244 +0000 UTC m=+143.213710105"
Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.850391 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.851074 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.351049826 +0000 UTC m=+143.779950687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.871198 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 11:53:55 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld
Feb 18 11:53:55 crc kubenswrapper[4880]: [+]process-running ok
Feb 18 11:53:55 crc kubenswrapper[4880]: healthz check failed
Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.871241 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.934807 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n" podStartSLOduration=122.934765361 podStartE2EDuration="2m2.934765361s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:55.914101337 +0000 UTC m=+143.343002218" watchObservedRunningTime="2026-02-18 11:53:55.934765361 +0000 UTC m=+143.363666222"
Feb 18 11:53:55 crc kubenswrapper[4880]: I0218 11:53:55.951943 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:55 crc kubenswrapper[4880]: E0218 11:53:55.952338 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.452322106 +0000 UTC m=+143.881222967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.055236 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.056358 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.556324734 +0000 UTC m=+143.985225605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.063112 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-5rcj7" podStartSLOduration=124.063091855 podStartE2EDuration="2m4.063091855s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.03952839 +0000 UTC m=+143.468429261" watchObservedRunningTime="2026-02-18 11:53:56.063091855 +0000 UTC m=+143.491992716"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.063184 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt" event={"ID":"c5edee35-4ff0-4da6-8db9-8cc26218d3f7","Type":"ContainerStarted","Data":"bc6ca65d5d8f2a0022e0892cd80edbf160d8989e6b72e97b7e7cf9993d8a5550"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.077012 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" event={"ID":"fbca9013-a5ce-4d98-8552-503f5b2d8f45","Type":"ContainerStarted","Data":"b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.077623 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.079283 4880 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-whp49 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.080837 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" podUID="fbca9013-a5ce-4d98-8552-503f5b2d8f45" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.117818 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" event={"ID":"7f8c1f5b-ef29-4f12-88af-7387efdd41e6","Type":"ContainerStarted","Data":"090c0a54b95f2e40a7141a47819c9928d6dca747206896364126feecc251fdcb"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.137657 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-stc5w"]
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.139493 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" event={"ID":"9f0337cb-0aed-41af-b587-a16392350413","Type":"ContainerStarted","Data":"54cb11ceecfa24c056963ad9ae9c6b06764f799f7bcdb9e4b2c7ff211397af6f"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.140174 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.144869 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g46bq" event={"ID":"35491756-4b45-4afa-a300-e8e55fa4b0c4","Type":"ContainerStarted","Data":"2e659ea812b1bbf09645e4a4fda4e77f6a6ef0149b623f868af5f32adeeea1a1"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.150780 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" event={"ID":"71911446-2b7a-45cb-8dc5-277b629ee152","Type":"ContainerStarted","Data":"6579ef3d475d51c78b4d29e10fda5974b88f0a158eb828b8c7390a640384503b"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.156886 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.158258 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.658240273 +0000 UTC m=+144.087141134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.165768 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" event={"ID":"5d4532b3-caee-400d-992c-023a97f6d0ca","Type":"ContainerStarted","Data":"d1a6f2b5da434105602e8bffd1703b53607cf8e08b155728fa79f3a52f135ab3"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.166147 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.180636 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" event={"ID":"0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a","Type":"ContainerStarted","Data":"8846fa76099b935e22741d969310364064b5b01d4f3ad7055a5f381302d234e0"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.202250 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp" event={"ID":"6b634ffd-eba0-480b-8e1a-7e2f26582d07","Type":"ContainerStarted","Data":"86cd81424ec876eb155e624adf7cb3a2c38f2e669520c2d9a65cb5441d0f749b"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.211134 4880 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sxz72 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.211198 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" podUID="9f0337cb-0aed-41af-b587-a16392350413" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.229031 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" event={"ID":"7c284d25-2b25-48b6-a3b7-d4178e4db154","Type":"ContainerStarted","Data":"d9a579f73aac893f0907ae33591c3204763165b5c761b6d65eb59101765e6983"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.251374 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" event={"ID":"3f0bd703-4d46-4dfd-942d-3b554f736833","Type":"ContainerStarted","Data":"2aad3ac7cb4df90038c15f6d4326b9d0ca3d6f23079c2194396f386fd9565b6e"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.251988 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.262853 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.269099 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.769069454 +0000 UTC m=+144.197970315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.272357 4880 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-6l7kh container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.272427 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" podUID="3f0bd703-4d46-4dfd-942d-3b554f736833" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.280375 4880 generic.go:334] "Generic (PLEG): container finished" podID="0329fe9d-0f10-43a2-a74b-657ffe52fcf3" containerID="84f1522c57c71e474bcb1febb17dce1e43d0b72876e6b9879fdeb59d8a9c6fb4" exitCode=0
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.280480 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" event={"ID":"0329fe9d-0f10-43a2-a74b-657ffe52fcf3","Type":"ContainerDied","Data":"84f1522c57c71e474bcb1febb17dce1e43d0b72876e6b9879fdeb59d8a9c6fb4"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.283908 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xzmw2"]
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.300739 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-j7s2s" event={"ID":"f54901d5-c293-48d0-bf46-0d14f035214b","Type":"ContainerStarted","Data":"014be239f9eb420905e5a1a654815e8f077d62cb110f81e61dad3262d8d2f325"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.329846 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6qqbh" event={"ID":"0032e4f7-d08f-4fe5-890c-f02eb48a7f86","Type":"ContainerStarted","Data":"9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.344964 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" event={"ID":"9b90b292-7e22-4eb8-872f-d5552b334604","Type":"ContainerStarted","Data":"10f83b050e950aab681f6923e0ba88a10b5ec9d6d4183f9fede6f8fbf7557c9a"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.347999 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz" event={"ID":"997abc8f-4776-4edd-9247-5bd0d51823f2","Type":"ContainerStarted","Data":"f4a9976001ee3129b18ac895b503edd68cd63448dc3a3dc0e3cee00147896c38"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.352113 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f" event={"ID":"eeba5175-b082-4db4-9db5-57be919585aa","Type":"ContainerStarted","Data":"d1158f7091de55093bb95e873f954b69cbda8ace5dd5209cff8ed6224241d6c9"}
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.353651 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.353737 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.359565 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-t9dmw" podStartSLOduration=123.35953523 podStartE2EDuration="2m3.35953523s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.335122401 +0000 UTC m=+143.764023272" watchObservedRunningTime="2026-02-18 11:53:56.35953523 +0000 UTC m=+143.788436091"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.371798 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lcr7n"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.372537 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.374280 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.874218164 +0000 UTC m=+144.303119035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.380594 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.381013 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.880995116 +0000 UTC m=+144.309895977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.469079 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-45gzd" podStartSLOduration=123.469055764 podStartE2EDuration="2m3.469055764s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.462419146 +0000 UTC m=+143.891320007" watchObservedRunningTime="2026-02-18 11:53:56.469055764 +0000 UTC m=+143.897956615"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.483005 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.484480 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:56.984430648 +0000 UTC m=+144.413331509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.521206 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" podStartSLOduration=123.521186776 podStartE2EDuration="2m3.521186776s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.518948074 +0000 UTC m=+143.947848935" watchObservedRunningTime="2026-02-18 11:53:56.521186776 +0000 UTC m=+143.950087637"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.587566 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.588089 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.088073236 +0000 UTC m=+144.516974097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.615164 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw"]
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.681480 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" podStartSLOduration=123.681440124 podStartE2EDuration="2m3.681440124s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.651806967 +0000 UTC m=+144.080707838" watchObservedRunningTime="2026-02-18 11:53:56.681440124 +0000 UTC m=+144.110340985"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.690439 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.690927 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.190871851 +0000 UTC m=+144.619772712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.706429 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 11:53:56 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld
Feb 18 11:53:56 crc kubenswrapper[4880]: [+]process-running ok
Feb 18 11:53:56 crc kubenswrapper[4880]: healthz check failed
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.706576 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.710725 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" podStartSLOduration=123.710706981 podStartE2EDuration="2m3.710706981s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.70611761 +0000 UTC m=+144.135018471" watchObservedRunningTime="2026-02-18 11:53:56.710706981 +0000 UTC m=+144.139607852"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.755081 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4pt2m"]
Feb 18 11:53:56 crc kubenswrapper[4880]: W0218 11:53:56.765664 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod590f26fc_f2c5_47c7_96c1_217f8fb36f67.slice/crio-8340a741ed724ecc030666cb5d9826dabaf049806fdee1c8de06099ef42cc3d2 WatchSource:0}: Error finding container 8340a741ed724ecc030666cb5d9826dabaf049806fdee1c8de06099ef42cc3d2: Status 404 returned error can't find the container with id 8340a741ed724ecc030666cb5d9826dabaf049806fdee1c8de06099ef42cc3d2
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.766463 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pcx4q"]
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.783190 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-6qqbh" podStartSLOduration=124.783170908 podStartE2EDuration="2m4.783170908s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.735584324 +0000 UTC m=+144.164485185" watchObservedRunningTime="2026-02-18 11:53:56.783170908 +0000 UTC m=+144.212071779"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.791265 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r"]
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.792036 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.792405 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.292394028 +0000 UTC m=+144.721294889 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.797116 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f"]
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.807333 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp"]
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.822978 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hjc5f" podStartSLOduration=124.822959032 podStartE2EDuration="2m4.822959032s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.761209507 +0000 UTC m=+144.190110368" watchObservedRunningTime="2026-02-18 11:53:56.822959032 +0000 UTC m=+144.251859893"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.823005 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6"]
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.865757 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vdp59" podStartSLOduration=123.865730819 podStartE2EDuration="2m3.865730819s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.84696122 +0000 UTC m=+144.275862081" watchObservedRunningTime="2026-02-18 11:53:56.865730819 +0000 UTC m=+144.294631680"
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.867058 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx"]
Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.894270 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.894451 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.394425861 +0000 UTC m=+144.823326722 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.905277 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:56 crc kubenswrapper[4880]: E0218 11:53:56.906231 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.406214243 +0000 UTC m=+144.835115104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:56 crc kubenswrapper[4880]: W0218 11:53:56.938258 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12162d78_4cd9_4850_b9b6_d9a5bd9bef0d.slice/crio-a7c7824bfdf7a01e0877b12b01e65bd4a59821ee32846403885d87f8ff430df0 WatchSource:0}: Error finding container a7c7824bfdf7a01e0877b12b01e65bd4a59821ee32846403885d87f8ff430df0: Status 404 returned error can't find the container with id a7c7824bfdf7a01e0877b12b01e65bd4a59821ee32846403885d87f8ff430df0 Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.944132 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n4tsp" podStartSLOduration=123.944113194 podStartE2EDuration="2m3.944113194s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.943003033 +0000 UTC m=+144.371903904" watchObservedRunningTime="2026-02-18 11:53:56.944113194 +0000 UTC m=+144.373014055" Feb 18 11:53:56 crc kubenswrapper[4880]: W0218 11:53:56.951843 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda20a14e8_19c7_4712_989c_cdf0b422b882.slice/crio-3f6c0e1a654f3e4033b9877a1792f145ad9fd26061c1f325494e5af9f5ba8bd2 WatchSource:0}: Error finding container 
3f6c0e1a654f3e4033b9877a1792f145ad9fd26061c1f325494e5af9f5ba8bd2: Status 404 returned error can't find the container with id 3f6c0e1a654f3e4033b9877a1792f145ad9fd26061c1f325494e5af9f5ba8bd2 Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.968343 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8wwmq"] Feb 18 11:53:56 crc kubenswrapper[4880]: W0218 11:53:56.970004 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod178d28de_f513_4c1f_bd35_886b84a1b892.slice/crio-d9e66533175515d3ac0649307f1bc9661aab72ede16db1a4c62e860572362065 WatchSource:0}: Error finding container d9e66533175515d3ac0649307f1bc9661aab72ede16db1a4c62e860572362065: Status 404 returned error can't find the container with id d9e66533175515d3ac0649307f1bc9661aab72ede16db1a4c62e860572362065 Feb 18 11:53:56 crc kubenswrapper[4880]: W0218 11:53:56.993692 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod644e6f4e_dcff_4163_92f5_87d0d42bf9e0.slice/crio-f795887a2109e3814c495c6a95b8578f4ffbc983448fb3077ddf180482aebf1c WatchSource:0}: Error finding container f795887a2109e3814c495c6a95b8578f4ffbc983448fb3077ddf180482aebf1c: Status 404 returned error can't find the container with id f795887a2109e3814c495c6a95b8578f4ffbc983448fb3077ddf180482aebf1c Feb 18 11:53:56 crc kubenswrapper[4880]: I0218 11:53:56.997315 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" podStartSLOduration=124.997301316 podStartE2EDuration="2m4.997301316s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:56.991577325 +0000 UTC m=+144.420478186" watchObservedRunningTime="2026-02-18 
11:53:56.997301316 +0000 UTC m=+144.426202177" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.007564 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.007902 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.507798383 +0000 UTC m=+144.936699244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.008256 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.008828 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 11:53:57.508808802 +0000 UTC m=+144.937709663 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.025079 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.112182 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.112417 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.612382698 +0000 UTC m=+145.041283559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.113303 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.113911 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.613891611 +0000 UTC m=+145.042792472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.217264 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.217810 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.717782315 +0000 UTC m=+145.146683196 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.319690 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.320184 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.820165858 +0000 UTC m=+145.249066709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.385712 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-j7s2s" event={"ID":"f54901d5-c293-48d0-bf46-0d14f035214b","Type":"ContainerStarted","Data":"572a2ed673c8bebe75755e2815c4ccf0f9e534155fa2cc363968d32d445d0cf5"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.397039 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" event={"ID":"9f587184-defe-4d9a-83a8-7cd95f215a55","Type":"ContainerStarted","Data":"b9113f671285ae75984831efe5580212c7ceec142be140baa05d104f4ea857e2"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.421161 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" event={"ID":"bcb7870f-01f8-423b-ac4c-5e2cae04ca25","Type":"ContainerStarted","Data":"895c140137c6ab2f400a5d7fc5a9c0021b8f0a6f0255ce0a2f338be5ba5ebb84"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.424576 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-j7s2s" podStartSLOduration=6.424555757 podStartE2EDuration="6.424555757s" podCreationTimestamp="2026-02-18 11:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.4098038 +0000 UTC 
m=+144.838704661" watchObservedRunningTime="2026-02-18 11:53:57.424555757 +0000 UTC m=+144.853456618" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.424729 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.424880 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.924857845 +0000 UTC m=+145.353758706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.426520 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.427653 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" 
event={"ID":"590f26fc-f2c5-47c7-96c1-217f8fb36f67","Type":"ContainerStarted","Data":"8340a741ed724ecc030666cb5d9826dabaf049806fdee1c8de06099ef42cc3d2"} Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.428533 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:57.928512488 +0000 UTC m=+145.357413349 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.430677 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" event={"ID":"71911446-2b7a-45cb-8dc5-277b629ee152","Type":"ContainerStarted","Data":"d310e3e9df6d6762d5ad3f410d80767e6de0f554adef202b5ba7b34148296ab1"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.442567 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lrf5s" podStartSLOduration=125.442541594 podStartE2EDuration="2m5.442541594s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.435429483 +0000 UTC m=+144.864330344" watchObservedRunningTime="2026-02-18 11:53:57.442541594 +0000 UTC m=+144.871442455" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.463385 4880 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" event={"ID":"7c284d25-2b25-48b6-a3b7-d4178e4db154","Type":"ContainerStarted","Data":"59c400a2d7e49c3253ece3c147a1322080a36bcc6bfd15ef5f8ca6b2d6bc892e"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.463452 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" event={"ID":"7c284d25-2b25-48b6-a3b7-d4178e4db154","Type":"ContainerStarted","Data":"57e02ea6184ad2d6d840c441b1db50a669c9f6ca8ff3105c7bbcc79697b6a7e9"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.473377 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" event={"ID":"f8561646-dee0-44aa-a718-fdeaaf0ff34b","Type":"ContainerStarted","Data":"69d8bf3fd89b08565055f58d9b3373aa6860424f0cffa5fc5a23f7899b7e8c14"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.524400 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" event={"ID":"2b7f342a-8424-4d37-96d3-de3851c1de4f","Type":"ContainerStarted","Data":"578325c1b34a9d69a498ba37a4d663ab9bcd8c540d292437235a8612ce56411f"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.528072 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zwwmm" podStartSLOduration=125.528047941 podStartE2EDuration="2m5.528047941s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.460228234 +0000 UTC m=+144.889129105" watchObservedRunningTime="2026-02-18 11:53:57.528047941 +0000 UTC m=+144.956948802" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.534269 4880 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.535823 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.035807409 +0000 UTC m=+145.464708270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.544150 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt" event={"ID":"c5edee35-4ff0-4da6-8db9-8cc26218d3f7","Type":"ContainerStarted","Data":"8486809add96c0f3dcd6b63dc068edc5990ef3df875895d7ab84942afde2f85d"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.549914 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ss7t9" podStartSLOduration=124.549888057 podStartE2EDuration="2m4.549888057s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 
11:53:57.496832188 +0000 UTC m=+144.925733069" watchObservedRunningTime="2026-02-18 11:53:57.549888057 +0000 UTC m=+144.978788918" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.565296 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pt2m" event={"ID":"a20a14e8-19c7-4712-989c-cdf0b422b882","Type":"ContainerStarted","Data":"3f6c0e1a654f3e4033b9877a1792f145ad9fd26061c1f325494e5af9f5ba8bd2"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.577335 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t4jzt" podStartSLOduration=125.577317322 podStartE2EDuration="2m5.577317322s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.575245023 +0000 UTC m=+145.004145904" watchObservedRunningTime="2026-02-18 11:53:57.577317322 +0000 UTC m=+145.006218183" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.583073 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" event={"ID":"1cb59275-0ea1-401b-b631-cf092f94b742","Type":"ContainerStarted","Data":"f3dd905c7c6736faea6b4109d9123db2bdec1d28b5742ec1c82caf1cc460848a"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.583926 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.596668 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" event={"ID":"594db65b-f9dd-4b9b-a572-e52e3d5a5875","Type":"ContainerStarted","Data":"38dfd7fe16029663972cc24f1099f57cbde36e2306b654724ad53c0683753dd5"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 
11:53:57.608362 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" event={"ID":"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d","Type":"ContainerStarted","Data":"a7c7824bfdf7a01e0877b12b01e65bd4a59821ee32846403885d87f8ff430df0"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.624824 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" podStartSLOduration=125.624790834 podStartE2EDuration="2m5.624790834s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.623451446 +0000 UTC m=+145.052352337" watchObservedRunningTime="2026-02-18 11:53:57.624790834 +0000 UTC m=+145.053691695" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.631212 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sqwt7" event={"ID":"fa1dfb09-2907-4e18-a008-e74e9fc62482","Type":"ContainerStarted","Data":"67dfa55722e29f7edbe1e2cf7eb6f472c84d5ae852602e61ae7dce39ba69af56"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.631297 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sqwt7" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.634111 4880 patch_prober.go:28] interesting pod/console-operator-58897d9998-sqwt7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/readyz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.634216 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sqwt7" 
podUID="fa1dfb09-2907-4e18-a008-e74e9fc62482" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/readyz\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.634406 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" event={"ID":"c34a3987-0ad6-4165-a6cf-717d47fea5fb","Type":"ContainerStarted","Data":"5dcb1b7c2dde4810da9af17461c85a7e7b64be9d753c6536693ccc24349354f7"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.634542 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.635558 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.636260 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.136233927 +0000 UTC m=+145.565134788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.636778 4880 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xzmw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.636935 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" podUID="c34a3987-0ad6-4165-a6cf-717d47fea5fb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.643200 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" podStartSLOduration=124.643102981 podStartE2EDuration="2m4.643102981s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.641563007 +0000 UTC m=+145.070463868" watchObservedRunningTime="2026-02-18 11:53:57.643102981 +0000 UTC m=+145.072003842" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.649943 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" event={"ID":"ccde08de-7978-48ba-a59b-34a979ab7fa8","Type":"ContainerStarted","Data":"f5f79428ee007be92cb267cdfacbdccb01dffcbedabe99ef297aacddfbb2ad1c"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.704439 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:53:57 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:53:57 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:53:57 crc kubenswrapper[4880]: healthz check failed Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.704494 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.707111 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sqwt7" podStartSLOduration=125.707093218 podStartE2EDuration="2m5.707093218s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.704177466 +0000 UTC m=+145.133078347" watchObservedRunningTime="2026-02-18 11:53:57.707093218 +0000 UTC m=+145.135994079" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.708132 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" podStartSLOduration=124.708123077 podStartE2EDuration="2m4.708123077s" podCreationTimestamp="2026-02-18 11:51:53 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.668330493 +0000 UTC m=+145.097231354" watchObservedRunningTime="2026-02-18 11:53:57.708123077 +0000 UTC m=+145.137023948" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.731277 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xhznq" podStartSLOduration=124.731252381 podStartE2EDuration="2m4.731252381s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.726498476 +0000 UTC m=+145.155399357" watchObservedRunningTime="2026-02-18 11:53:57.731252381 +0000 UTC m=+145.160153242" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.739383 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.739922 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.239889455 +0000 UTC m=+145.668790316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.740293 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.742234 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.24222272 +0000 UTC m=+145.671123571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.770012 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g46bq" event={"ID":"35491756-4b45-4afa-a300-e8e55fa4b0c4","Type":"ContainerStarted","Data":"e3a8f7c3d6befe331e6b5deea45540c35964041b3d516c513a82d0e13a69bfa2"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.791505 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-g46bq" podStartSLOduration=124.791484103 podStartE2EDuration="2m4.791484103s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.788113497 +0000 UTC m=+145.217014358" watchObservedRunningTime="2026-02-18 11:53:57.791484103 +0000 UTC m=+145.220384964" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.800485 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" event={"ID":"7bc4b180-6c06-4d90-8eb7-b096bc72b76a","Type":"ContainerStarted","Data":"e03bcece23e30f4c01029421c4e839c99be6b9ec13b2eda043525d00c5e1a558"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.833770 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" 
event={"ID":"0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a","Type":"ContainerStarted","Data":"acf94a797a4338797a362671d4f09f200c4ba7d59ee494ab2db0f2042b524c5c"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.835119 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.843715 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jqn5r" podStartSLOduration=124.843698277 podStartE2EDuration="2m4.843698277s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.840284501 +0000 UTC m=+145.269185362" watchObservedRunningTime="2026-02-18 11:53:57.843698277 +0000 UTC m=+145.272599138" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.844780 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.844875 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.34485484 +0000 UTC m=+145.773755701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.845694 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.847198 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.347186915 +0000 UTC m=+145.776087776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.879878 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" podStartSLOduration=124.879864709 podStartE2EDuration="2m4.879864709s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.87883891 +0000 UTC m=+145.307739771" watchObservedRunningTime="2026-02-18 11:53:57.879864709 +0000 UTC m=+145.308765570" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.881118 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-stc5w" event={"ID":"4cab1e2b-abe4-4e53-ada6-d77879bcde5e","Type":"ContainerStarted","Data":"7d769f15bc21c5780d1afba5c234ba8a64e5d05ca680893c2f1de19ee1fe9d32"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.881154 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-stc5w" event={"ID":"4cab1e2b-abe4-4e53-ada6-d77879bcde5e","Type":"ContainerStarted","Data":"ba6119ac32170a90794f5235d8a48bddf3bad7b4fe4f5500b27e9198a512157d"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.921699 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" 
event={"ID":"644e6f4e-dcff-4163-92f5-87d0d42bf9e0","Type":"ContainerStarted","Data":"f795887a2109e3814c495c6a95b8578f4ffbc983448fb3077ddf180482aebf1c"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.922834 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.923085 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-stc5w" podStartSLOduration=6.92306117 podStartE2EDuration="6.92306117s" podCreationTimestamp="2026-02-18 11:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.922435701 +0000 UTC m=+145.351336592" watchObservedRunningTime="2026-02-18 11:53:57.92306117 +0000 UTC m=+145.351962031" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.927056 4880 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-llwhx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.927118 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" podUID="644e6f4e-dcff-4163-92f5-87d0d42bf9e0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.929941 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" 
event={"ID":"e74b6ff4-3a1b-4340-9076-992e4a049e8e","Type":"ContainerStarted","Data":"e8c762cd49e128c36ff1cb2e4b00d95c8f181a3d06803e5051f6c608fb8c5c1f"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.943688 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" event={"ID":"502f8aab-255f-468a-a21d-3c1fe4aaf8b2","Type":"ContainerStarted","Data":"02f10853e9f62d2ddf42f6c89487ceed808a66bab7c1c3a3b77027098fba3575"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.950688 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:57 crc kubenswrapper[4880]: E0218 11:53:57.950945 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.450929897 +0000 UTC m=+145.879830758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.953841 4880 generic.go:334] "Generic (PLEG): container finished" podID="d05e33ff-25c2-4f7f-809c-7583945b4e7c" containerID="de6596725fc7ecf5ddcde294cbaf93407176dbe7ccd4d38cf3c1253ef20d5940" exitCode=0 Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.953942 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wb62r" event={"ID":"d05e33ff-25c2-4f7f-809c-7583945b4e7c","Type":"ContainerDied","Data":"de6596725fc7ecf5ddcde294cbaf93407176dbe7ccd4d38cf3c1253ef20d5940"} Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.963113 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" podStartSLOduration=124.963074319 podStartE2EDuration="2m4.963074319s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:57.95988892 +0000 UTC m=+145.388789771" watchObservedRunningTime="2026-02-18 11:53:57.963074319 +0000 UTC m=+145.391975180" Feb 18 11:53:57 crc kubenswrapper[4880]: I0218 11:53:57.980017 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz" event={"ID":"997abc8f-4776-4edd-9247-5bd0d51823f2","Type":"ContainerStarted","Data":"76c6a5caf98ac29b1ca4b0f549fe75bedb77637b321ab96423805eb61b7a3c8a"} Feb 18 11:53:58 crc 
kubenswrapper[4880]: I0218 11:53:58.005346 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" event={"ID":"0329fe9d-0f10-43a2-a74b-657ffe52fcf3","Type":"ContainerStarted","Data":"2c172eb8f3a065b28747718ede8d021c8565957708c07a4b0bb673fbfb70afa7"} Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.008354 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" event={"ID":"178d28de-f513-4c1f-bd35-886b84a1b892","Type":"ContainerStarted","Data":"d9e66533175515d3ac0649307f1bc9661aab72ede16db1a4c62e860572362065"} Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.009825 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfz9j" podStartSLOduration=125.009762639 podStartE2EDuration="2m5.009762639s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:58.006488436 +0000 UTC m=+145.435389297" watchObservedRunningTime="2026-02-18 11:53:58.009762639 +0000 UTC m=+145.438663500" Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.010836 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" event={"ID":"d45b4ded-00aa-4fcd-b166-59f997447d20","Type":"ContainerStarted","Data":"b830c3bd2a5787dc9167d626799e2a2269c780ca098fad6e8fd2b467f73295ff"} Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.022537 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.023395 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6l7kh" Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.045027 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.053277 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:58 crc kubenswrapper[4880]: E0218 11:53:58.074624 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.57458949 +0000 UTC m=+146.003490351 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.164530 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:58 crc kubenswrapper[4880]: E0218 11:53:58.165224 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.66520767 +0000 UTC m=+146.094108531 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.237438 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" podStartSLOduration=125.23741918 podStartE2EDuration="2m5.23741918s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:58.235775763 +0000 UTC m=+145.664676624" watchObservedRunningTime="2026-02-18 11:53:58.23741918 +0000 UTC m=+145.666320041" Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.271829 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:58 crc kubenswrapper[4880]: E0218 11:53:58.272211 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.772195902 +0000 UTC m=+146.201096763 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.276773 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" podStartSLOduration=125.27674788 podStartE2EDuration="2m5.27674788s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:58.272786438 +0000 UTC m=+145.701687309" watchObservedRunningTime="2026-02-18 11:53:58.27674788 +0000 UTC m=+145.705648751" Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.336211 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.337642 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.339102 4880 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-sdkl4 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.28:8443/livez\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.339172 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4" podUID="0329fe9d-0f10-43a2-a74b-657ffe52fcf3" 
containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.28:8443/livez\": dial tcp 10.217.0.28:8443: connect: connection refused" Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.372681 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:58 crc kubenswrapper[4880]: E0218 11:53:58.373311 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.873287488 +0000 UTC m=+146.302188359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.474272 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:58 crc kubenswrapper[4880]: E0218 11:53:58.474761 4880 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:58.974744614 +0000 UTC m=+146.403645475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.576399 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:53:58 crc kubenswrapper[4880]: E0218 11:53:58.576836 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.076814198 +0000 UTC m=+146.505715059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.678075 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:53:58 crc kubenswrapper[4880]: E0218 11:53:58.678523 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.17850714 +0000 UTC m=+146.607408001 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.680559 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 11:53:58 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld
Feb 18 11:53:58 crc kubenswrapper[4880]: [+]process-running ok
Feb 18 11:53:58 crc kubenswrapper[4880]: healthz check failed
Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.680626 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.779111 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:58 crc kubenswrapper[4880]: E0218 11:53:58.779601 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.279583955 +0000 UTC m=+146.708484816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.881028 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:58 crc kubenswrapper[4880]: E0218 11:53:58.881447 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.381433273 +0000 UTC m=+146.810334134 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:58 crc kubenswrapper[4880]: I0218 11:53:58.982518 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:58 crc kubenswrapper[4880]: E0218 11:53:58.982886 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.482870999 +0000 UTC m=+146.911771860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.023293 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pt2m" event={"ID":"a20a14e8-19c7-4712-989c-cdf0b422b882","Type":"ContainerStarted","Data":"ecc08879d24951443a2cb4809b7f5d8d156e24b0aa4922ad8eb3e7108e06aa77"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.023732 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4pt2m"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.023744 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pt2m" event={"ID":"a20a14e8-19c7-4712-989c-cdf0b422b882","Type":"ContainerStarted","Data":"da7b9143aed5c6cda0dec0a59c281d606b1678d42dbf9bf63be514523dba65ee"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.032295 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz" event={"ID":"997abc8f-4776-4edd-9247-5bd0d51823f2","Type":"ContainerStarted","Data":"35d70a8bb3a7665cc5af33b490f04bfe27c7b0826d821726519ae5cb068fb753"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.037516 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" event={"ID":"0bf486aa-14ba-4dcb-9af7-ffbd491d5d6a","Type":"ContainerStarted","Data":"291fc3d954dbda6faf4158bffe20071fbd825d0a7c94257060a7316902d53932"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.041238 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" event={"ID":"590f26fc-f2c5-47c7-96c1-217f8fb36f67","Type":"ContainerStarted","Data":"94e0438998886503cc93a1a005218fb3436d0a33dc67e3af48d1d92394fc80bb"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.047393 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-cbjbp" event={"ID":"178d28de-f513-4c1f-bd35-886b84a1b892","Type":"ContainerStarted","Data":"90cf376c9d570402e2cba3e3af85fcf80e6fe122c10d50cfa314f336d760e0fd"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.052084 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4pt2m" podStartSLOduration=8.052067333 podStartE2EDuration="8.052067333s" podCreationTimestamp="2026-02-18 11:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.051149197 +0000 UTC m=+146.480050068" watchObservedRunningTime="2026-02-18 11:53:59.052067333 +0000 UTC m=+146.480968184"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.058632 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" event={"ID":"d45b4ded-00aa-4fcd-b166-59f997447d20","Type":"ContainerStarted","Data":"08c1068b765e4e01010d62a8f88593285a8b060ce25ba199a72009b734caea61"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.058709 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" event={"ID":"d45b4ded-00aa-4fcd-b166-59f997447d20","Type":"ContainerStarted","Data":"6a2551488800de2713d85e9df00f42777ea4606e93966c219fd7d9889e56334c"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.065144 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" event={"ID":"bcb7870f-01f8-423b-ac4c-5e2cae04ca25","Type":"ContainerStarted","Data":"e588a41046d5f5366d951b924cc2d4d8d9c5e9e87e897fba19983247c67685f0"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.069323 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" event={"ID":"e74b6ff4-3a1b-4340-9076-992e4a049e8e","Type":"ContainerStarted","Data":"208e426599f310f79f7f1d7520248af8a52ceb7892ff2fbab53a1eff52f16e48"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.069401 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" event={"ID":"e74b6ff4-3a1b-4340-9076-992e4a049e8e","Type":"ContainerStarted","Data":"8927a524aacf91b35f685f5d166a64f61cde50d3851053e252e6d6459e841d56"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.071732 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" event={"ID":"c34a3987-0ad6-4165-a6cf-717d47fea5fb","Type":"ContainerStarted","Data":"c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.072658 4880 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xzmw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.072693 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" podUID="c34a3987-0ad6-4165-a6cf-717d47fea5fb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.074594 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-429s2" event={"ID":"594db65b-f9dd-4b9b-a572-e52e3d5a5875","Type":"ContainerStarted","Data":"4e3eb12d5e85eafae6736da445dd1311c1171cc0011807a84c353134b9a92eeb"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.075953 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" event={"ID":"12162d78-4cd9-4850-b9b6-d9a5bd9bef0d","Type":"ContainerStarted","Data":"182e1b6e4644d2b059103580d5bbc8b6806fc546c84652d5bd02d6377eaee6ef"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.085819 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b4gvz" podStartSLOduration=126.085797086 podStartE2EDuration="2m6.085797086s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.081421502 +0000 UTC m=+146.510322363" watchObservedRunningTime="2026-02-18 11:53:59.085797086 +0000 UTC m=+146.514697947"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.087695 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.089335 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" event={"ID":"644e6f4e-dcff-4163-92f5-87d0d42bf9e0","Type":"ContainerStarted","Data":"55a3e0e0cb55131916d4d6b0a0f3680ec665d3cef2c786cfc2df644b8d348572"}
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.089979 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.589966804 +0000 UTC m=+147.018867665 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.090890 4880 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-llwhx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body=
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.090964 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" podUID="644e6f4e-dcff-4163-92f5-87d0d42bf9e0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.101654 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wb62r" event={"ID":"d05e33ff-25c2-4f7f-809c-7583945b4e7c","Type":"ContainerStarted","Data":"405a6198dd04f7ea7a25b849e02a137399ad1f86361e453ad28330258959800a"}
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.106893 4880 patch_prober.go:28] interesting pod/console-operator-58897d9998-sqwt7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/readyz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body=
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.106966 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sqwt7" podUID="fa1dfb09-2907-4e18-a008-e74e9fc62482" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/readyz\": dial tcp 10.217.0.30:8443: connect: connection refused"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.112642 4880 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vdwwz container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.112722 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" podUID="1cb59275-0ea1-401b-b631-cf092f94b742" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.118509 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" podStartSLOduration=127.118468429 podStartE2EDuration="2m7.118468429s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.112978733 +0000 UTC m=+146.541879604" watchObservedRunningTime="2026-02-18 11:53:59.118468429 +0000 UTC m=+146.547369300"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.162239 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cdwn2" podStartSLOduration=126.162221385 podStartE2EDuration="2m6.162221385s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.159720935 +0000 UTC m=+146.588621796" watchObservedRunningTime="2026-02-18 11:53:59.162221385 +0000 UTC m=+146.591122246"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.188691 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.188908 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.688873298 +0000 UTC m=+147.117774169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.189486 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.207066 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.707044812 +0000 UTC m=+147.135945673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.209694 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xgvx6" podStartSLOduration=126.209674716 podStartE2EDuration="2m6.209674716s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.206243208 +0000 UTC m=+146.635144069" watchObservedRunningTime="2026-02-18 11:53:59.209674716 +0000 UTC m=+146.638575577"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.293185 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4nq8f" podStartSLOduration=126.293159264 podStartE2EDuration="2m6.293159264s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.290977473 +0000 UTC m=+146.719878364" watchObservedRunningTime="2026-02-18 11:53:59.293159264 +0000 UTC m=+146.722060125"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.295887 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pcx4q" podStartSLOduration=126.295826439 podStartE2EDuration="2m6.295826439s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.245320393 +0000 UTC m=+146.674221254" watchObservedRunningTime="2026-02-18 11:53:59.295826439 +0000 UTC m=+146.724727310"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.296269 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.296591 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.796574201 +0000 UTC m=+147.225475062 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.324374 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9ql7r" podStartSLOduration=126.324356985 podStartE2EDuration="2m6.324356985s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:53:59.324256832 +0000 UTC m=+146.753157713" watchObservedRunningTime="2026-02-18 11:53:59.324356985 +0000 UTC m=+146.753257846"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.402445 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.402886 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:53:59.902873774 +0000 UTC m=+147.331774635 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.486028 4880 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vdwwz container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.486502 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" podUID="1cb59275-0ea1-401b-b631-cf092f94b742" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.486150 4880 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vdwwz container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.486745 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" podUID="1cb59275-0ea1-401b-b631-cf092f94b742" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.504710 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.504969 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.004931556 +0000 UTC m=+147.433832427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.505065 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.505545 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.005534174 +0000 UTC m=+147.434435055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.606666 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.606939 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.106891567 +0000 UTC m=+147.535792428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.680884 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 11:53:59 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld
Feb 18 11:53:59 crc kubenswrapper[4880]: [+]process-running ok
Feb 18 11:53:59 crc kubenswrapper[4880]: healthz check failed
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.680986 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.709127 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.709706 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.209678621 +0000 UTC m=+147.638579482 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.810513 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.810741 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.310706285 +0000 UTC m=+147.739607146 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.811009 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.811452 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.311440355 +0000 UTC m=+147.740341226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.911820 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.912025 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.411994796 +0000 UTC m=+147.840895657 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:53:59 crc kubenswrapper[4880]: I0218 11:53:59.912185 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:53:59 crc kubenswrapper[4880]: E0218 11:53:59.912490 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.4124791 +0000 UTC m=+147.841379961 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.014075 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.014345 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.514302506 +0000 UTC m=+147.943203367 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.014482 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.014571 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.014645 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.015140 4880 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.515121109 +0000 UTC m=+147.944021970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.016809 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.041680 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.098880 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.108102 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" event={"ID":"2b7f342a-8424-4d37-96d3-de3851c1de4f","Type":"ContainerStarted","Data":"e8eb9b8156b4147b779c09abaf3094aea8a5b4b104872aa2f3fc4be444fb3495"} Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.115954 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.116285 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.116327 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.117202 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 11:54:00.617167572 +0000 UTC m=+148.046068613 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.121380 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.129564 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.130871 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wb62r" event={"ID":"d05e33ff-25c2-4f7f-809c-7583945b4e7c","Type":"ContainerStarted","Data":"9de729498f1e2d0bfbeeb40b3382b1c88501b810386a4ee237d8d4449884a962"} Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.131845 4880 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xzmw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get 
\"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.131912 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" podUID="c34a3987-0ad6-4165-a6cf-717d47fea5fb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.132511 4880 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-llwhx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.132581 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" podUID="644e6f4e-dcff-4163-92f5-87d0d42bf9e0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.220000 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.221214 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 11:54:00.721198301 +0000 UTC m=+148.150099162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.267196 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wb62r" podStartSLOduration=128.26718098 podStartE2EDuration="2m8.26718098s" podCreationTimestamp="2026-02-18 11:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:00.265472642 +0000 UTC m=+147.694373503" watchObservedRunningTime="2026-02-18 11:54:00.26718098 +0000 UTC m=+147.696081841" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.322189 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.322516 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.822499722 +0000 UTC m=+148.251400583 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.413436 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.422682 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.423878 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.424236 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:00.924221536 +0000 UTC m=+148.353122397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.483923 4880 csr.go:261] certificate signing request csr-t6tjm is approved, waiting to be issued Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.495651 4880 csr.go:257] certificate signing request csr-t6tjm is issued Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.539475 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.539790 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.039773441 +0000 UTC m=+148.468674292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.540063 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.540496 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.040488931 +0000 UTC m=+148.469389792 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.644142 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.644357 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.144307153 +0000 UTC m=+148.573208014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.644449 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.645023 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.145013684 +0000 UTC m=+148.573914535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.681627 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:00 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:00 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:00 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.681692 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:00 crc kubenswrapper[4880]: W0218 11:54:00.719521 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-c3683d9550f1b1a27afb92b5696f9c2bb287fcd1d1d7bad1627c1912f5b5406b WatchSource:0}: Error finding container c3683d9550f1b1a27afb92b5696f9c2bb287fcd1d1d7bad1627c1912f5b5406b: Status 404 returned error can't find the container with id c3683d9550f1b1a27afb92b5696f9c2bb287fcd1d1d7bad1627c1912f5b5406b Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.745114 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.745502 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.245484271 +0000 UTC m=+148.674385132 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.853280 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.854066 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.354053169 +0000 UTC m=+148.782954030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:00 crc kubenswrapper[4880]: I0218 11:54:00.963102 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:00 crc kubenswrapper[4880]: E0218 11:54:00.963483 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.46346696 +0000 UTC m=+148.892367821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.066239 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.067082 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.567069906 +0000 UTC m=+148.995970767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.168304 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.168590 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.668573764 +0000 UTC m=+149.097474625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.210887 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c3683d9550f1b1a27afb92b5696f9c2bb287fcd1d1d7bad1627c1912f5b5406b"} Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.279628 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.300236 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.800209783 +0000 UTC m=+149.229110644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.387501 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.388397 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.888372584 +0000 UTC m=+149.317273445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.490033 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.490801 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:01.990787267 +0000 UTC m=+149.419688128 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.520746 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-18 11:49:00 +0000 UTC, rotation deadline is 2026-11-03 02:06:54.403084986 +0000 UTC Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.520789 4880 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6182h12m52.882298912s for next certificate rotation Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.591471 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.591910 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:02.091887273 +0000 UTC m=+149.520788124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.682879 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:01 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:01 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:01 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.682959 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.692766 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.693073 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 11:54:02.19306118 +0000 UTC m=+149.621962041 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.793912 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.794277 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:02.2942607 +0000 UTC m=+149.723161561 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.895597 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.895908 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:02.395896971 +0000 UTC m=+149.824797832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.996983 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.997156 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:02.49712982 +0000 UTC m=+149.926030681 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:01 crc kubenswrapper[4880]: I0218 11:54:01.997259 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:01 crc kubenswrapper[4880]: E0218 11:54:01.997599 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:02.497581793 +0000 UTC m=+149.926482674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.100714 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.101240 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:02.601217411 +0000 UTC m=+150.030118272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.202134 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.202439 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:02.70242806 +0000 UTC m=+150.131328921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.212713 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f71c2942020515224a1fdf35815e4f4f54920a3b6dab17a092deeca0f65753b3"} Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.215279 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d859d318f09101145c670740253a25eb4300cd691e66c7ff60da0107e4ea67c2"} Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.215307 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"560d2d3a925e0dac6f24262870f5f408bc9135c237fff26185d496d111edbc07"} Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.215650 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.216939 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1782e652aaf558e7492a66f1daaf2378cdb8643d1827291df2cd392f3bd9c9fb"} Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.216962 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6b0ac74944425f3325d8365a74cf13ba08e195ca348c383c02a4bee9a0a62f3f"} Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.304365 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.305352 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:02.805335608 +0000 UTC m=+150.234236459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.406314 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.406696 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:02.90668072 +0000 UTC m=+150.335581581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.505956 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vdwwz" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.506918 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.507055 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.007038246 +0000 UTC m=+150.435939107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.507126 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.507369 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.007361965 +0000 UTC m=+150.436262826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.570020 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5zrsr"] Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.571437 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.577106 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.586167 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zrsr"] Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.608425 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.609768 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.109748407 +0000 UTC m=+150.538649278 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.683079 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:02 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:02 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:02 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.683202 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.710209 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8wnc\" (UniqueName: \"kubernetes.io/projected/e333b514-7367-498c-9660-500e02cfb188-kube-api-access-c8wnc\") pod \"community-operators-5zrsr\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") " pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.710291 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-catalog-content\") pod 
\"community-operators-5zrsr\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") " pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.710319 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-utilities\") pod \"community-operators-5zrsr\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") " pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.710359 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.710829 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.210786381 +0000 UTC m=+150.639687312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.754189 4880 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.761455 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bsnd4"] Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.762586 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsnd4" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.765036 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.774819 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bsnd4"] Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.811427 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.811767 4880 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.311736833 +0000 UTC m=+150.740637694 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.811969 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-catalog-content\") pod \"community-operators-5zrsr\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") " pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.811994 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-utilities\") pod \"community-operators-5zrsr\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") " pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.812032 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:02 crc 
kubenswrapper[4880]: I0218 11:54:02.812113 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8wnc\" (UniqueName: \"kubernetes.io/projected/e333b514-7367-498c-9660-500e02cfb188-kube-api-access-c8wnc\") pod \"community-operators-5zrsr\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") " pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.812744 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.312731651 +0000 UTC m=+150.741632512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.813316 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-utilities\") pod \"community-operators-5zrsr\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") " pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.813472 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-catalog-content\") pod \"community-operators-5zrsr\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") " pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:02 crc 
kubenswrapper[4880]: I0218 11:54:02.853218 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8wnc\" (UniqueName: \"kubernetes.io/projected/e333b514-7367-498c-9660-500e02cfb188-kube-api-access-c8wnc\") pod \"community-operators-5zrsr\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") " pod="openshift-marketplace/community-operators-5zrsr"
Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.911316 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zrsr"
Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.919251 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.919555 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.419522058 +0000 UTC m=+150.848422919 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.919641 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.919748 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-utilities\") pod \"certified-operators-bsnd4\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") " pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.920344 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fssnx\" (UniqueName: \"kubernetes.io/projected/40e949a6-d734-42e6-9423-8597f6d4c9de-kube-api-access-fssnx\") pod \"certified-operators-bsnd4\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") " pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.920395 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-catalog-content\") pod \"certified-operators-bsnd4\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") " pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:54:02 crc kubenswrapper[4880]: E0218 11:54:02.920561 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.420541977 +0000 UTC m=+150.849442838 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.962908 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bxlz8"]
Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.964013 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:02 crc kubenswrapper[4880]: I0218 11:54:02.987573 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxlz8"]
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.029209 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:54:03 crc kubenswrapper[4880]: E0218 11:54:03.029510 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.529477014 +0000 UTC m=+150.958377875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.029574 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fssnx\" (UniqueName: \"kubernetes.io/projected/40e949a6-d734-42e6-9423-8597f6d4c9de-kube-api-access-fssnx\") pod \"certified-operators-bsnd4\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") " pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.029644 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-catalog-content\") pod \"certified-operators-bsnd4\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") " pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.029802 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.029854 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-utilities\") pod \"certified-operators-bsnd4\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") " pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.030361 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-utilities\") pod \"certified-operators-bsnd4\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") " pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:54:03 crc kubenswrapper[4880]: E0218 11:54:03.030542 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.530522043 +0000 UTC m=+150.959422904 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.031511 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-catalog-content\") pod \"certified-operators-bsnd4\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") " pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.049925 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fssnx\" (UniqueName: \"kubernetes.io/projected/40e949a6-d734-42e6-9423-8597f6d4c9de-kube-api-access-fssnx\") pod \"certified-operators-bsnd4\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") " pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.122134 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.132480 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:54:03 crc kubenswrapper[4880]: E0218 11:54:03.132700 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.632660308 +0000 UTC m=+151.061561169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.132885 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-catalog-content\") pod \"community-operators-bxlz8\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.132959 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9b7d\" (UniqueName: \"kubernetes.io/projected/1f881593-be88-4dca-a7df-6b287588efbb-kube-api-access-j9b7d\") pod \"community-operators-bxlz8\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.133016 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.133054 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-utilities\") pod \"community-operators-bxlz8\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:03 crc kubenswrapper[4880]: E0218 11:54:03.133412 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.6333952 +0000 UTC m=+151.062296061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.158274 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.159299 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.173352 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.173453 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.190952 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.197054 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6dl89"]
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.198239 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.201700 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zrsr"]
Feb 18 11:54:03 crc kubenswrapper[4880]: W0218 11:54:03.216052 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode333b514_7367_498c_9660_500e02cfb188.slice/crio-df90c46bab9866bbdfbc6323b0766f989109dbdd6dd9833cb4031fd3b6f237ef WatchSource:0}: Error finding container df90c46bab9866bbdfbc6323b0766f989109dbdd6dd9833cb4031fd3b6f237ef: Status 404 returned error can't find the container with id df90c46bab9866bbdfbc6323b0766f989109dbdd6dd9833cb4031fd3b6f237ef
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.219445 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6dl89"]
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.244019 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.244486 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-catalog-content\") pod \"community-operators-bxlz8\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.244545 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9b7d\" (UniqueName: \"kubernetes.io/projected/1f881593-be88-4dca-a7df-6b287588efbb-kube-api-access-j9b7d\") pod \"community-operators-bxlz8\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.244572 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c77bccf-f2dd-482e-9191-f86f0de73c74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c77bccf-f2dd-482e-9191-f86f0de73c74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.244621 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c77bccf-f2dd-482e-9191-f86f0de73c74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c77bccf-f2dd-482e-9191-f86f0de73c74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.244671 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-utilities\") pod \"community-operators-bxlz8\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.245032 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-utilities\") pod \"community-operators-bxlz8\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:03 crc kubenswrapper[4880]: E0218 11:54:03.245110 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.745096395 +0000 UTC m=+151.173997256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.245335 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-catalog-content\") pod \"community-operators-bxlz8\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.250569 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrsr" event={"ID":"e333b514-7367-498c-9660-500e02cfb188","Type":"ContainerStarted","Data":"df90c46bab9866bbdfbc6323b0766f989109dbdd6dd9833cb4031fd3b6f237ef"}
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.275918 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9b7d\" (UniqueName: \"kubernetes.io/projected/1f881593-be88-4dca-a7df-6b287588efbb-kube-api-access-j9b7d\") pod \"community-operators-bxlz8\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.288843 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" event={"ID":"2b7f342a-8424-4d37-96d3-de3851c1de4f","Type":"ContainerStarted","Data":"ea766bcf427809bd7cf812142943869eede741dd4ad02fb292a683228dc60bbf"}
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.288921 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" event={"ID":"2b7f342a-8424-4d37-96d3-de3851c1de4f","Type":"ContainerStarted","Data":"6e2281be97bdd571b9bf206be8425f580c0131f72389179fb15ead92514f15de"}
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.288936 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" event={"ID":"2b7f342a-8424-4d37-96d3-de3851c1de4f","Type":"ContainerStarted","Data":"7b7669c59c31a6242ea6bd16702e0d478970f5410e3829e16588173c8b4d2c93"}
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.308702 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.308776 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.308943 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.309080 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.315413 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-8wwmq" podStartSLOduration=12.315390731 podStartE2EDuration="12.315390731s" podCreationTimestamp="2026-02-18 11:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:03.314241578 +0000 UTC m=+150.743142429" watchObservedRunningTime="2026-02-18 11:54:03.315390731 +0000 UTC m=+150.744291592"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.320210 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxlz8"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.347500 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-utilities\") pod \"certified-operators-6dl89\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.347580 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nk7z\" (UniqueName: \"kubernetes.io/projected/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-kube-api-access-4nk7z\") pod \"certified-operators-6dl89\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.347641 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c77bccf-f2dd-482e-9191-f86f0de73c74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c77bccf-f2dd-482e-9191-f86f0de73c74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.348210 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c77bccf-f2dd-482e-9191-f86f0de73c74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c77bccf-f2dd-482e-9191-f86f0de73c74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.347683 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.348341 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-catalog-content\") pod \"certified-operators-6dl89\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.348423 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c77bccf-f2dd-482e-9191-f86f0de73c74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c77bccf-f2dd-482e-9191-f86f0de73c74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 11:54:03 crc kubenswrapper[4880]: E0218 11:54:03.349244 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.849227587 +0000 UTC m=+151.278128448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bj57p" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.351916 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.378483 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c77bccf-f2dd-482e-9191-f86f0de73c74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c77bccf-f2dd-482e-9191-f86f0de73c74\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.400418 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sdkl4"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.400728 4880 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-18T11:54:02.754218238Z","Handler":null,"Name":""}
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.452062 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.452833 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-catalog-content\") pod \"certified-operators-6dl89\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: E0218 11:54:03.452938 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 11:54:03.952900395 +0000 UTC m=+151.381801246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.452963 4880 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.453084 4880 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.453201 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-utilities\") pod \"certified-operators-6dl89\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.453268 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-catalog-content\") pod \"certified-operators-6dl89\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.453292 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nk7z\" (UniqueName: \"kubernetes.io/projected/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-kube-api-access-4nk7z\") pod \"certified-operators-6dl89\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.453358 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.453680 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-utilities\") pod \"certified-operators-6dl89\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.470650 4880 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.470696 4880 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.496118 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.499540 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bsnd4"]
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.499698 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nk7z\" (UniqueName: \"kubernetes.io/projected/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-kube-api-access-4nk7z\") pod \"certified-operators-6dl89\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.515546 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6dl89"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.532553 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-6qqbh"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.535439 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-6qqbh"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.547120 4880 patch_prober.go:28] interesting pod/console-f9d7485db-6qqbh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.547193 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6qqbh" podUID="0032e4f7-d08f-4fe5-890c-f02eb48a7f86" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.568809 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wb62r"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.570368 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wb62r"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.586276 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bj57p\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.596751 4880 patch_prober.go:28] interesting pod/apiserver-76f77b778f-wb62r container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]log ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]etcd ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]poststarthook/max-in-flight-filter ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Feb 18 11:54:03 crc kubenswrapper[4880]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]poststarthook/project.openshift.io-projectcache ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]poststarthook/openshift.io-startinformers ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]poststarthook/openshift.io-restmapperupdater ok
Feb 18 11:54:03 crc kubenswrapper[4880]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 18 11:54:03 crc kubenswrapper[4880]: livez check failed
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.596829 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-wb62r" podUID="d05e33ff-25c2-4f7f-809c-7583945b4e7c" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.637502 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sqwt7"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.660624 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.686630 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-t9dmw"
Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.696389 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8".
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.696655 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:03 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:03 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:03 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.696707 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.772513 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.878214 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6dl89"] Feb 18 11:54:03 crc kubenswrapper[4880]: W0218 11:54:03.903103 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a68f0fc_e36e_4682_a3d4_3885ec4b473f.slice/crio-4d4479d7e597ba61b58ce60a6204e8204fd157b12b7b48abb254f3c71197cd5e WatchSource:0}: Error finding container 4d4479d7e597ba61b58ce60a6204e8204fd157b12b7b48abb254f3c71197cd5e: Status 404 returned error can't find the container with id 4d4479d7e597ba61b58ce60a6204e8204fd157b12b7b48abb254f3c71197cd5e Feb 18 11:54:03 crc kubenswrapper[4880]: I0218 11:54:03.965949 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxlz8"] Feb 18 11:54:04 crc 
kubenswrapper[4880]: W0218 11:54:04.011706 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f881593_be88_4dca_a7df_6b287588efbb.slice/crio-756399d8096a590a0519ceafa8bb47545621deb9b23dff050e830109105ca2b5 WatchSource:0}: Error finding container 756399d8096a590a0519ceafa8bb47545621deb9b23dff050e830109105ca2b5: Status 404 returned error can't find the container with id 756399d8096a590a0519ceafa8bb47545621deb9b23dff050e830109105ca2b5 Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.131447 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.265210 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bj57p"] Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.305328 4880 generic.go:334] "Generic (PLEG): container finished" podID="590f26fc-f2c5-47c7-96c1-217f8fb36f67" containerID="94e0438998886503cc93a1a005218fb3436d0a33dc67e3af48d1d92394fc80bb" exitCode=0 Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.305440 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" event={"ID":"590f26fc-f2c5-47c7-96c1-217f8fb36f67","Type":"ContainerDied","Data":"94e0438998886503cc93a1a005218fb3436d0a33dc67e3af48d1d92394fc80bb"} Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.309586 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9c77bccf-f2dd-482e-9191-f86f0de73c74","Type":"ContainerStarted","Data":"2819eeec69d3bb58e0b54e6c37f716ddbabab5dcba71f85dae4bfc4323c32786"} Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.312133 4880 generic.go:334] "Generic (PLEG): container finished" podID="1f881593-be88-4dca-a7df-6b287588efbb" 
containerID="5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580" exitCode=0 Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.312201 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxlz8" event={"ID":"1f881593-be88-4dca-a7df-6b287588efbb","Type":"ContainerDied","Data":"5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580"} Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.312221 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxlz8" event={"ID":"1f881593-be88-4dca-a7df-6b287588efbb","Type":"ContainerStarted","Data":"756399d8096a590a0519ceafa8bb47545621deb9b23dff050e830109105ca2b5"} Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.314512 4880 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.315273 4880 generic.go:334] "Generic (PLEG): container finished" podID="e333b514-7367-498c-9660-500e02cfb188" containerID="1929907c176b3fbd8247311ed5caadebfadcc54bfe10c0d2000e89cf2dbe61e1" exitCode=0 Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.315330 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrsr" event={"ID":"e333b514-7367-498c-9660-500e02cfb188","Type":"ContainerDied","Data":"1929907c176b3fbd8247311ed5caadebfadcc54bfe10c0d2000e89cf2dbe61e1"} Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.323847 4880 generic.go:334] "Generic (PLEG): container finished" podID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerID="5c901d2364328bd2f2d119d06037af4cab4f981a6053601304d341ee3d05b90a" exitCode=0 Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.323962 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnd4" 
event={"ID":"40e949a6-d734-42e6-9423-8597f6d4c9de","Type":"ContainerDied","Data":"5c901d2364328bd2f2d119d06037af4cab4f981a6053601304d341ee3d05b90a"} Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.323991 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnd4" event={"ID":"40e949a6-d734-42e6-9423-8597f6d4c9de","Type":"ContainerStarted","Data":"89ae412976a8ddaea91d832224ed7cfd80f920d0e3e9116982e656ab66f9dd01"} Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.330987 4880 generic.go:334] "Generic (PLEG): container finished" podID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerID="0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a" exitCode=0 Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.331653 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dl89" event={"ID":"0a68f0fc-e36e-4682-a3d4-3885ec4b473f","Type":"ContainerDied","Data":"0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a"} Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.331709 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dl89" event={"ID":"0a68f0fc-e36e-4682-a3d4-3885ec4b473f","Type":"ContainerStarted","Data":"4d4479d7e597ba61b58ce60a6204e8204fd157b12b7b48abb254f3c71197cd5e"} Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.566466 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.678724 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:04 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:04 crc 
kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:04 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.678805 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.762731 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rr7dq"] Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.764115 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.765865 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.776013 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rr7dq"] Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.882756 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-catalog-content\") pod \"redhat-marketplace-rr7dq\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") " pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.882818 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-utilities\") pod \"redhat-marketplace-rr7dq\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") " pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:04 crc kubenswrapper[4880]: 
I0218 11:54:04.882887 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qh96\" (UniqueName: \"kubernetes.io/projected/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-kube-api-access-5qh96\") pod \"redhat-marketplace-rr7dq\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") " pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.926796 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llwhx" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.984623 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-catalog-content\") pod \"redhat-marketplace-rr7dq\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") " pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.984687 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-utilities\") pod \"redhat-marketplace-rr7dq\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") " pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.984809 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qh96\" (UniqueName: \"kubernetes.io/projected/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-kube-api-access-5qh96\") pod \"redhat-marketplace-rr7dq\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") " pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.985319 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-catalog-content\") pod \"redhat-marketplace-rr7dq\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") " pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:04 crc kubenswrapper[4880]: I0218 11:54:04.985529 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-utilities\") pod \"redhat-marketplace-rr7dq\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") " pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.015929 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qh96\" (UniqueName: \"kubernetes.io/projected/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-kube-api-access-5qh96\") pod \"redhat-marketplace-rr7dq\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") " pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.080007 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.166428 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xpv8s"] Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.167689 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.198453 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.209520 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpv8s"] Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.288503 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-utilities\") pod \"redhat-marketplace-xpv8s\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.289136 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp4mz\" (UniqueName: \"kubernetes.io/projected/1dd2f4dc-890d-485f-a29a-caba7ea13cde-kube-api-access-xp4mz\") pod \"redhat-marketplace-xpv8s\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.289418 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-catalog-content\") pod \"redhat-marketplace-xpv8s\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.357360 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" 
event={"ID":"a7f1515f-71f7-47dc-b429-1ce17401b9ea","Type":"ContainerStarted","Data":"ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7"} Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.357436 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" event={"ID":"a7f1515f-71f7-47dc-b429-1ce17401b9ea","Type":"ContainerStarted","Data":"f822f50dd490787ff4a1b5b605fa1230e42285702a05a3b34552a225ac9ecc6e"} Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.357567 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.365544 4880 generic.go:334] "Generic (PLEG): container finished" podID="9c77bccf-f2dd-482e-9191-f86f0de73c74" containerID="441a2cded6f12c89048907bb2cc30997c5dd4648fc555be3003e992bea322e2b" exitCode=0 Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.365964 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9c77bccf-f2dd-482e-9191-f86f0de73c74","Type":"ContainerDied","Data":"441a2cded6f12c89048907bb2cc30997c5dd4648fc555be3003e992bea322e2b"} Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.390774 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp4mz\" (UniqueName: \"kubernetes.io/projected/1dd2f4dc-890d-485f-a29a-caba7ea13cde-kube-api-access-xp4mz\") pod \"redhat-marketplace-xpv8s\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.390887 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-catalog-content\") pod \"redhat-marketplace-xpv8s\" (UID: 
\"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.390928 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-utilities\") pod \"redhat-marketplace-xpv8s\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.393935 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-catalog-content\") pod \"redhat-marketplace-xpv8s\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.419365 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" podStartSLOduration=132.419340027 podStartE2EDuration="2m12.419340027s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:05.388920728 +0000 UTC m=+152.817821609" watchObservedRunningTime="2026-02-18 11:54:05.419340027 +0000 UTC m=+152.848240888" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.419480 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-utilities\") pod \"redhat-marketplace-xpv8s\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.442290 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-rr7dq"] Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.444729 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp4mz\" (UniqueName: \"kubernetes.io/projected/1dd2f4dc-890d-485f-a29a-caba7ea13cde-kube-api-access-xp4mz\") pod \"redhat-marketplace-xpv8s\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.503146 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.680950 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:05 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:05 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:05 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.681277 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.746381 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.764591 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cbfgr"] Feb 18 11:54:05 crc kubenswrapper[4880]: E0218 11:54:05.765195 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590f26fc-f2c5-47c7-96c1-217f8fb36f67" containerName="collect-profiles" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.765336 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="590f26fc-f2c5-47c7-96c1-217f8fb36f67" containerName="collect-profiles" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.765680 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="590f26fc-f2c5-47c7-96c1-217f8fb36f67" containerName="collect-profiles" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.773726 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.777887 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.784102 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cbfgr"] Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.902485 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr66w\" (UniqueName: \"kubernetes.io/projected/590f26fc-f2c5-47c7-96c1-217f8fb36f67-kube-api-access-cr66w\") pod \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.902577 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/590f26fc-f2c5-47c7-96c1-217f8fb36f67-config-volume\") pod \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.902681 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/590f26fc-f2c5-47c7-96c1-217f8fb36f67-secret-volume\") pod \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\" (UID: \"590f26fc-f2c5-47c7-96c1-217f8fb36f67\") " Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.903998 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590f26fc-f2c5-47c7-96c1-217f8fb36f67-config-volume" (OuterVolumeSpecName: "config-volume") pod "590f26fc-f2c5-47c7-96c1-217f8fb36f67" (UID: "590f26fc-f2c5-47c7-96c1-217f8fb36f67"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.904083 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-catalog-content\") pod \"redhat-operators-cbfgr\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") " pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.904122 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pjcv\" (UniqueName: \"kubernetes.io/projected/95755e2d-9c90-48ce-8141-7c209fd6d193-kube-api-access-9pjcv\") pod \"redhat-operators-cbfgr\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") " pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.904438 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-utilities\") pod \"redhat-operators-cbfgr\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") " pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.904634 4880 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/590f26fc-f2c5-47c7-96c1-217f8fb36f67-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.910175 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590f26fc-f2c5-47c7-96c1-217f8fb36f67-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "590f26fc-f2c5-47c7-96c1-217f8fb36f67" (UID: "590f26fc-f2c5-47c7-96c1-217f8fb36f67"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:54:05 crc kubenswrapper[4880]: I0218 11:54:05.910760 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590f26fc-f2c5-47c7-96c1-217f8fb36f67-kube-api-access-cr66w" (OuterVolumeSpecName: "kube-api-access-cr66w") pod "590f26fc-f2c5-47c7-96c1-217f8fb36f67" (UID: "590f26fc-f2c5-47c7-96c1-217f8fb36f67"). InnerVolumeSpecName "kube-api-access-cr66w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.008156 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-utilities\") pod \"redhat-operators-cbfgr\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") " pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.008282 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-catalog-content\") pod \"redhat-operators-cbfgr\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") " pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.008314 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pjcv\" (UniqueName: \"kubernetes.io/projected/95755e2d-9c90-48ce-8141-7c209fd6d193-kube-api-access-9pjcv\") pod \"redhat-operators-cbfgr\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") " pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.008367 4880 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/590f26fc-f2c5-47c7-96c1-217f8fb36f67-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.008386 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr66w\" (UniqueName: \"kubernetes.io/projected/590f26fc-f2c5-47c7-96c1-217f8fb36f67-kube-api-access-cr66w\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.009809 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-utilities\") pod \"redhat-operators-cbfgr\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") " pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.010175 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-catalog-content\") pod \"redhat-operators-cbfgr\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") " pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.069924 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pjcv\" (UniqueName: \"kubernetes.io/projected/95755e2d-9c90-48ce-8141-7c209fd6d193-kube-api-access-9pjcv\") pod \"redhat-operators-cbfgr\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") " pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.091335 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.136134 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpv8s"] Feb 18 11:54:06 crc kubenswrapper[4880]: W0218 11:54:06.160549 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dd2f4dc_890d_485f_a29a_caba7ea13cde.slice/crio-0717aa4cf40db7c2c8211d31ad17004827d55ba7ee982c02b9cbb32511a9f1e4 WatchSource:0}: Error finding container 0717aa4cf40db7c2c8211d31ad17004827d55ba7ee982c02b9cbb32511a9f1e4: Status 404 returned error can't find the container with id 0717aa4cf40db7c2c8211d31ad17004827d55ba7ee982c02b9cbb32511a9f1e4 Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.166896 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bqgh2"] Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.175218 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bqgh2"] Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.175354 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.321919 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-catalog-content\") pod \"redhat-operators-bqgh2\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.322500 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-utilities\") pod \"redhat-operators-bqgh2\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.322568 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmhzx\" (UniqueName: \"kubernetes.io/projected/4166cca9-9e48-45b1-9751-e458a37c2e09-kube-api-access-lmhzx\") pod \"redhat-operators-bqgh2\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.400542 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpv8s" event={"ID":"1dd2f4dc-890d-485f-a29a-caba7ea13cde","Type":"ContainerStarted","Data":"0717aa4cf40db7c2c8211d31ad17004827d55ba7ee982c02b9cbb32511a9f1e4"} Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.408446 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" event={"ID":"590f26fc-f2c5-47c7-96c1-217f8fb36f67","Type":"ContainerDied","Data":"8340a741ed724ecc030666cb5d9826dabaf049806fdee1c8de06099ef42cc3d2"} Feb 18 11:54:06 crc 
kubenswrapper[4880]: I0218 11:54:06.408501 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523585-rvbgw" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.408530 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8340a741ed724ecc030666cb5d9826dabaf049806fdee1c8de06099ef42cc3d2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.423366 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmhzx\" (UniqueName: \"kubernetes.io/projected/4166cca9-9e48-45b1-9751-e458a37c2e09-kube-api-access-lmhzx\") pod \"redhat-operators-bqgh2\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.423449 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-catalog-content\") pod \"redhat-operators-bqgh2\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.423482 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-utilities\") pod \"redhat-operators-bqgh2\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.424085 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-utilities\") pod \"redhat-operators-bqgh2\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc 
kubenswrapper[4880]: I0218 11:54:06.439648 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-catalog-content\") pod \"redhat-operators-bqgh2\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.444795 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rr7dq" event={"ID":"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7","Type":"ContainerStarted","Data":"d97a34a1755e3a664ffa5eb76f2b70fe7f9e91371f0e4855fedf75c7a9abc744"} Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.484124 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmhzx\" (UniqueName: \"kubernetes.io/projected/4166cca9-9e48-45b1-9751-e458a37c2e09-kube-api-access-lmhzx\") pod \"redhat-operators-bqgh2\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.500752 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cbfgr"] Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.505706 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.681383 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:06 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:06 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:06 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.682117 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.735415 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.825227 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bqgh2"] Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.830187 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c77bccf-f2dd-482e-9191-f86f0de73c74-kubelet-dir\") pod \"9c77bccf-f2dd-482e-9191-f86f0de73c74\" (UID: \"9c77bccf-f2dd-482e-9191-f86f0de73c74\") " Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.830537 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c77bccf-f2dd-482e-9191-f86f0de73c74-kube-api-access\") pod \"9c77bccf-f2dd-482e-9191-f86f0de73c74\" (UID: \"9c77bccf-f2dd-482e-9191-f86f0de73c74\") " Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.830766 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c77bccf-f2dd-482e-9191-f86f0de73c74-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9c77bccf-f2dd-482e-9191-f86f0de73c74" (UID: "9c77bccf-f2dd-482e-9191-f86f0de73c74"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.831049 4880 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c77bccf-f2dd-482e-9191-f86f0de73c74-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.836293 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c77bccf-f2dd-482e-9191-f86f0de73c74-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9c77bccf-f2dd-482e-9191-f86f0de73c74" (UID: "9c77bccf-f2dd-482e-9191-f86f0de73c74"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.841090 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 18 11:54:06 crc kubenswrapper[4880]: E0218 11:54:06.841334 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c77bccf-f2dd-482e-9191-f86f0de73c74" containerName="pruner" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.841354 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c77bccf-f2dd-482e-9191-f86f0de73c74" containerName="pruner" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.841495 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c77bccf-f2dd-482e-9191-f86f0de73c74" containerName="pruner" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.841972 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.844039 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.844312 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.849363 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 18 11:54:06 crc kubenswrapper[4880]: W0218 11:54:06.850008 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4166cca9_9e48_45b1_9751_e458a37c2e09.slice/crio-b99b2dd5da4b3bf2faca1400316389a4a49a3fd37431b2107446e326b804ab15 WatchSource:0}: Error finding container b99b2dd5da4b3bf2faca1400316389a4a49a3fd37431b2107446e326b804ab15: Status 404 returned error can't find the container with id b99b2dd5da4b3bf2faca1400316389a4a49a3fd37431b2107446e326b804ab15 Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.933650 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:54:06 crc kubenswrapper[4880]: I0218 11:54:06.933763 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:54:06 crc 
kubenswrapper[4880]: I0218 11:54:06.933890 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c77bccf-f2dd-482e-9191-f86f0de73c74-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.034932 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.035007 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.035400 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.053987 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.175966 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.452509 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbfgr" event={"ID":"95755e2d-9c90-48ce-8141-7c209fd6d193","Type":"ContainerStarted","Data":"59f243e58d09443974564189b46ae65b4242341133f9e29debb9954f8d29a538"} Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.458851 4880 generic.go:334] "Generic (PLEG): container finished" podID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerID="ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e" exitCode=0 Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.458909 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rr7dq" event={"ID":"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7","Type":"ContainerDied","Data":"ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e"} Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.466381 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.467017 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9c77bccf-f2dd-482e-9191-f86f0de73c74","Type":"ContainerDied","Data":"2819eeec69d3bb58e0b54e6c37f716ddbabab5dcba71f85dae4bfc4323c32786"} Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.467042 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2819eeec69d3bb58e0b54e6c37f716ddbabab5dcba71f85dae4bfc4323c32786" Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.472568 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqgh2" event={"ID":"4166cca9-9e48-45b1-9751-e458a37c2e09","Type":"ContainerStarted","Data":"b99b2dd5da4b3bf2faca1400316389a4a49a3fd37431b2107446e326b804ab15"} Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.490479 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.679741 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:07 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:07 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:07 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:07 crc kubenswrapper[4880]: I0218 11:54:07.680084 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:08 crc kubenswrapper[4880]: 
I0218 11:54:08.490519 4880 generic.go:334] "Generic (PLEG): container finished" podID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerID="62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b" exitCode=0 Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.490993 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbfgr" event={"ID":"95755e2d-9c90-48ce-8141-7c209fd6d193","Type":"ContainerDied","Data":"62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b"} Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.494496 4880 generic.go:334] "Generic (PLEG): container finished" podID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" containerID="61e46be63aec36f4ffdfa316e499442aa885dfd9a07fda914f6a5336aee06352" exitCode=0 Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.494624 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpv8s" event={"ID":"1dd2f4dc-890d-485f-a29a-caba7ea13cde","Type":"ContainerDied","Data":"61e46be63aec36f4ffdfa316e499442aa885dfd9a07fda914f6a5336aee06352"} Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.495861 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9","Type":"ContainerStarted","Data":"bf8d069bd7c08fe1bbb9348a60dfe7ec0c13b5f52423cc5a7ebfc2f6261edacd"} Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.495898 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9","Type":"ContainerStarted","Data":"d028e6016333cb077a5092d54448bdc4b081b7a1ef67d6c9e210334273ff1a71"} Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.519010 4880 generic.go:334] "Generic (PLEG): container finished" podID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerID="c9ac54567a4aea59c2652cb48a4333d3778c982dd9149ce6f1c6cadf03bf0f42" 
exitCode=0 Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.519100 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqgh2" event={"ID":"4166cca9-9e48-45b1-9751-e458a37c2e09","Type":"ContainerDied","Data":"c9ac54567a4aea59c2652cb48a4333d3778c982dd9149ce6f1c6cadf03bf0f42"} Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.535303 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.535286282 podStartE2EDuration="2.535286282s" podCreationTimestamp="2026-02-18 11:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:08.533028977 +0000 UTC m=+155.961929858" watchObservedRunningTime="2026-02-18 11:54:08.535286282 +0000 UTC m=+155.964187143" Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.580861 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.587772 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wb62r" Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.683979 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:08 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:08 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:08 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:08 crc kubenswrapper[4880]: I0218 11:54:08.684058 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" 
podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:09 crc kubenswrapper[4880]: I0218 11:54:09.537253 4880 generic.go:334] "Generic (PLEG): container finished" podID="09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9" containerID="bf8d069bd7c08fe1bbb9348a60dfe7ec0c13b5f52423cc5a7ebfc2f6261edacd" exitCode=0 Feb 18 11:54:09 crc kubenswrapper[4880]: I0218 11:54:09.538323 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9","Type":"ContainerDied","Data":"bf8d069bd7c08fe1bbb9348a60dfe7ec0c13b5f52423cc5a7ebfc2f6261edacd"} Feb 18 11:54:09 crc kubenswrapper[4880]: I0218 11:54:09.643079 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4pt2m" Feb 18 11:54:09 crc kubenswrapper[4880]: I0218 11:54:09.678496 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:09 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:09 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:09 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:09 crc kubenswrapper[4880]: I0218 11:54:09.678628 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:10 crc kubenswrapper[4880]: I0218 11:54:10.678629 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:10 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:10 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:10 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:10 crc kubenswrapper[4880]: I0218 11:54:10.678700 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:11 crc kubenswrapper[4880]: I0218 11:54:11.677630 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:11 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:11 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:11 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:11 crc kubenswrapper[4880]: I0218 11:54:11.677700 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:12 crc kubenswrapper[4880]: I0218 11:54:12.678621 4880 patch_prober.go:28] interesting pod/router-default-5444994796-t9dmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 11:54:12 crc kubenswrapper[4880]: [-]has-synced failed: reason withheld Feb 18 11:54:12 crc kubenswrapper[4880]: [+]process-running ok Feb 18 11:54:12 crc kubenswrapper[4880]: healthz check failed Feb 18 11:54:12 crc kubenswrapper[4880]: I0218 11:54:12.678899 4880 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9dmw" podUID="89f16e7c-79df-4d83-85bf-e714bbd768fc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 11:54:13 crc kubenswrapper[4880]: I0218 11:54:13.308973 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:13 crc kubenswrapper[4880]: I0218 11:54:13.309050 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:13 crc kubenswrapper[4880]: I0218 11:54:13.309658 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:13 crc kubenswrapper[4880]: I0218 11:54:13.309745 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:13 crc kubenswrapper[4880]: I0218 11:54:13.537188 4880 patch_prober.go:28] interesting pod/console-f9d7485db-6qqbh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 18 11:54:13 crc 
kubenswrapper[4880]: I0218 11:54:13.537570 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6qqbh" podUID="0032e4f7-d08f-4fe5-890c-f02eb48a7f86" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 18 11:54:13 crc kubenswrapper[4880]: I0218 11:54:13.678145 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:54:13 crc kubenswrapper[4880]: I0218 11:54:13.681093 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-t9dmw" Feb 18 11:54:15 crc kubenswrapper[4880]: I0218 11:54:15.279118 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:54:15 crc kubenswrapper[4880]: I0218 11:54:15.285958 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6981df1-6d75-41e2-a41e-ac960f0a847a-metrics-certs\") pod \"network-metrics-daemon-nj7dq\" (UID: \"c6981df1-6d75-41e2-a41e-ac960f0a847a\") " pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:54:15 crc kubenswrapper[4880]: I0218 11:54:15.428941 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nj7dq" Feb 18 11:54:16 crc kubenswrapper[4880]: I0218 11:54:16.004148 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:54:16 crc kubenswrapper[4880]: I0218 11:54:16.089768 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kube-api-access\") pod \"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9\" (UID: \"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9\") " Feb 18 11:54:16 crc kubenswrapper[4880]: I0218 11:54:16.089875 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kubelet-dir\") pod \"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9\" (UID: \"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9\") " Feb 18 11:54:16 crc kubenswrapper[4880]: I0218 11:54:16.089936 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9" (UID: "09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:54:16 crc kubenswrapper[4880]: I0218 11:54:16.090129 4880 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:16 crc kubenswrapper[4880]: I0218 11:54:16.093908 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9" (UID: "09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:54:16 crc kubenswrapper[4880]: I0218 11:54:16.191471 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:16 crc kubenswrapper[4880]: I0218 11:54:16.618062 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9","Type":"ContainerDied","Data":"d028e6016333cb077a5092d54448bdc4b081b7a1ef67d6c9e210334273ff1a71"} Feb 18 11:54:16 crc kubenswrapper[4880]: I0218 11:54:16.618763 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d028e6016333cb077a5092d54448bdc4b081b7a1ef67d6c9e210334273ff1a71" Feb 18 11:54:16 crc kubenswrapper[4880]: I0218 11:54:16.618104 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.274691 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.275402 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.308422 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: 
Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.308422 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.308486 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.308540 4880 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-6n78q" Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.308546 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.308870 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.308891 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.309186 4880 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"c51384ba3fd1325809de037e607e8f0a61c882e0bc397bc3a80b22ca6ab5ea08"} pod="openshift-console/downloads-7954f5f757-6n78q" containerMessage="Container download-server failed liveness probe, will be restarted" Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.309280 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" containerID="cri-o://c51384ba3fd1325809de037e607e8f0a61c882e0bc397bc3a80b22ca6ab5ea08" gracePeriod=2 Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.562977 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.573148 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 11:54:23 crc kubenswrapper[4880]: I0218 11:54:23.778386 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" Feb 18 11:54:24 crc kubenswrapper[4880]: I0218 11:54:24.668724 4880 generic.go:334] "Generic (PLEG): container finished" podID="129989af-f3cc-44bf-9779-6688338d3130" containerID="c51384ba3fd1325809de037e607e8f0a61c882e0bc397bc3a80b22ca6ab5ea08" exitCode=0 Feb 18 11:54:24 crc kubenswrapper[4880]: I0218 11:54:24.668814 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6n78q" 
event={"ID":"129989af-f3cc-44bf-9779-6688338d3130","Type":"ContainerDied","Data":"c51384ba3fd1325809de037e607e8f0a61c882e0bc397bc3a80b22ca6ab5ea08"} Feb 18 11:54:32 crc kubenswrapper[4880]: E0218 11:54:32.251808 4880 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 18 11:54:32 crc kubenswrapper[4880]: E0218 11:54:32.252515 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fssnx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOn
Error,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bsnd4_openshift-marketplace(40e949a6-d734-42e6-9423-8597f6d4c9de): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 11:54:32 crc kubenswrapper[4880]: E0218 11:54:32.253733 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bsnd4" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" Feb 18 11:54:32 crc kubenswrapper[4880]: E0218 11:54:32.287451 4880 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 18 11:54:32 crc kubenswrapper[4880]: E0218 11:54:32.287658 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4nk7z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6dl89_openshift-marketplace(0a68f0fc-e36e-4682-a3d4-3885ec4b473f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 11:54:32 crc kubenswrapper[4880]: E0218 11:54:32.288962 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6dl89" podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" Feb 18 11:54:32 crc 
kubenswrapper[4880]: E0218 11:54:32.367531 4880 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 18 11:54:32 crc kubenswrapper[4880]: E0218 11:54:32.368137 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j9b7d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-bxlz8_openshift-marketplace(1f881593-be88-4dca-a7df-6b287588efbb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 11:54:32 crc kubenswrapper[4880]: E0218 11:54:32.369572 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bxlz8" podUID="1f881593-be88-4dca-a7df-6b287588efbb" Feb 18 11:54:32 crc kubenswrapper[4880]: I0218 11:54:32.656748 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nj7dq"] Feb 18 11:54:33 crc kubenswrapper[4880]: I0218 11:54:33.310146 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:33 crc kubenswrapper[4880]: I0218 11:54:33.310865 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:33 crc kubenswrapper[4880]: E0218 11:54:33.941905 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6dl89" podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" Feb 18 11:54:33 crc kubenswrapper[4880]: E0218 11:54:33.941972 4880 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bsnd4" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" Feb 18 11:54:34 crc kubenswrapper[4880]: I0218 11:54:34.582941 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tkr2r" Feb 18 11:54:34 crc kubenswrapper[4880]: I0218 11:54:34.739914 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" event={"ID":"c6981df1-6d75-41e2-a41e-ac960f0a847a","Type":"ContainerStarted","Data":"f2d32a025fb3353b613c5af682b5aee2b43734017f606d5eabd40e3bd73d59bc"} Feb 18 11:54:38 crc kubenswrapper[4880]: E0218 11:54:38.705417 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bxlz8" podUID="1f881593-be88-4dca-a7df-6b287588efbb" Feb 18 11:54:38 crc kubenswrapper[4880]: I0218 11:54:38.789747 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6n78q" event={"ID":"129989af-f3cc-44bf-9779-6688338d3130","Type":"ContainerStarted","Data":"f6a9b07d9ae320fa6a2903aec64c02dab1b3f4e40e2237592c80f136909acdfb"} Feb 18 11:54:38 crc kubenswrapper[4880]: I0218 11:54:38.791838 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6n78q" Feb 18 11:54:38 crc kubenswrapper[4880]: I0218 11:54:38.798041 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:38 crc kubenswrapper[4880]: I0218 11:54:38.798222 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 11:54:39.802468 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbfgr" event={"ID":"95755e2d-9c90-48ce-8141-7c209fd6d193","Type":"ContainerStarted","Data":"24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f"} Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 11:54:39.805603 4880 generic.go:334] "Generic (PLEG): container finished" podID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerID="acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314" exitCode=0 Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 11:54:39.805682 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rr7dq" event={"ID":"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7","Type":"ContainerDied","Data":"acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314"} Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 11:54:39.809763 4880 generic.go:334] "Generic (PLEG): container finished" podID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" containerID="9b62fec9d478e81fda8b5efcdc1488eb972c84979888f668693866238f17f1f3" exitCode=0 Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 11:54:39.809829 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpv8s" event={"ID":"1dd2f4dc-890d-485f-a29a-caba7ea13cde","Type":"ContainerDied","Data":"9b62fec9d478e81fda8b5efcdc1488eb972c84979888f668693866238f17f1f3"} Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 
11:54:39.812458 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqgh2" event={"ID":"4166cca9-9e48-45b1-9751-e458a37c2e09","Type":"ContainerStarted","Data":"acbaa270debca5e7418153d68ff71bd6e93651a628151c091d932e8ff92aade1"} Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 11:54:39.814517 4880 generic.go:334] "Generic (PLEG): container finished" podID="e333b514-7367-498c-9660-500e02cfb188" containerID="daec6b250c4131128b4d0394905996015979024a9a0b57c557465abb935a7667" exitCode=0 Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 11:54:39.814678 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrsr" event={"ID":"e333b514-7367-498c-9660-500e02cfb188","Type":"ContainerDied","Data":"daec6b250c4131128b4d0394905996015979024a9a0b57c557465abb935a7667"} Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 11:54:39.820881 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" event={"ID":"c6981df1-6d75-41e2-a41e-ac960f0a847a","Type":"ContainerStarted","Data":"e1fe7d37593c5f8d17f7e91fca53caec5e784cc16e56140424c87aebb4bccaea"} Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 11:54:39.821990 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:39 crc kubenswrapper[4880]: I0218 11:54:39.822061 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:40 crc kubenswrapper[4880]: I0218 11:54:40.447676 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 11:54:40 crc kubenswrapper[4880]: I0218 11:54:40.831705 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nj7dq" event={"ID":"c6981df1-6d75-41e2-a41e-ac960f0a847a","Type":"ContainerStarted","Data":"cf7e33f0447693db625e19f2b736c00f47444e6abf3e7ac0d070ca379c23a85b"} Feb 18 11:54:40 crc kubenswrapper[4880]: I0218 11:54:40.833131 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:40 crc kubenswrapper[4880]: I0218 11:54:40.833184 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.027290 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nj7dq" podStartSLOduration=169.027257699 podStartE2EDuration="2m49.027257699s" podCreationTimestamp="2026-02-18 11:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:40.860767309 +0000 UTC m=+188.289668190" watchObservedRunningTime="2026-02-18 11:54:42.027257699 +0000 UTC m=+189.456158560" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.028988 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 11:54:42 crc kubenswrapper[4880]: E0218 11:54:42.029345 4880 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9" containerName="pruner" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.029385 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9" containerName="pruner" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.029514 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b4e6e8-8433-44ad-aa08-3dd9ad5d95e9" containerName="pruner" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.030147 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.032476 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.034690 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.037885 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.086393 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/514b7655-4cf7-42aa-b634-a43a20bb4300-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"514b7655-4cf7-42aa-b634-a43a20bb4300\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.086450 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/514b7655-4cf7-42aa-b634-a43a20bb4300-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"514b7655-4cf7-42aa-b634-a43a20bb4300\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 
11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.187896 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/514b7655-4cf7-42aa-b634-a43a20bb4300-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"514b7655-4cf7-42aa-b634-a43a20bb4300\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.187968 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/514b7655-4cf7-42aa-b634-a43a20bb4300-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"514b7655-4cf7-42aa-b634-a43a20bb4300\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.188117 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/514b7655-4cf7-42aa-b634-a43a20bb4300-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"514b7655-4cf7-42aa-b634-a43a20bb4300\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.225571 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/514b7655-4cf7-42aa-b634-a43a20bb4300-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"514b7655-4cf7-42aa-b634-a43a20bb4300\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.355992 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9tqb6"] Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.411245 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.851426 4880 generic.go:334] "Generic (PLEG): container finished" podID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerID="24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f" exitCode=0 Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.851494 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbfgr" event={"ID":"95755e2d-9c90-48ce-8141-7c209fd6d193","Type":"ContainerDied","Data":"24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f"} Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.858099 4880 generic.go:334] "Generic (PLEG): container finished" podID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerID="acbaa270debca5e7418153d68ff71bd6e93651a628151c091d932e8ff92aade1" exitCode=0 Feb 18 11:54:42 crc kubenswrapper[4880]: I0218 11:54:42.858158 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqgh2" event={"ID":"4166cca9-9e48-45b1-9751-e458a37c2e09","Type":"ContainerDied","Data":"acbaa270debca5e7418153d68ff71bd6e93651a628151c091d932e8ff92aade1"} Feb 18 11:54:43 crc kubenswrapper[4880]: I0218 11:54:43.308581 4880 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:43 crc kubenswrapper[4880]: I0218 11:54:43.308729 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:43 crc kubenswrapper[4880]: I0218 11:54:43.308919 4880 
patch_prober.go:28] interesting pod/downloads-7954f5f757-6n78q container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 18 11:54:43 crc kubenswrapper[4880]: I0218 11:54:43.308991 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6n78q" podUID="129989af-f3cc-44bf-9779-6688338d3130" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 18 11:54:43 crc kubenswrapper[4880]: I0218 11:54:43.752100 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 11:54:43 crc kubenswrapper[4880]: W0218 11:54:43.759291 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod514b7655_4cf7_42aa_b634_a43a20bb4300.slice/crio-70b4ba04c9879d8dac55e78213cb1cf5c0a8f4451c4d37aa62d8a005cfdfdf56 WatchSource:0}: Error finding container 70b4ba04c9879d8dac55e78213cb1cf5c0a8f4451c4d37aa62d8a005cfdfdf56: Status 404 returned error can't find the container with id 70b4ba04c9879d8dac55e78213cb1cf5c0a8f4451c4d37aa62d8a005cfdfdf56 Feb 18 11:54:43 crc kubenswrapper[4880]: I0218 11:54:43.868952 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"514b7655-4cf7-42aa-b634-a43a20bb4300","Type":"ContainerStarted","Data":"70b4ba04c9879d8dac55e78213cb1cf5c0a8f4451c4d37aa62d8a005cfdfdf56"} Feb 18 11:54:44 crc kubenswrapper[4880]: I0218 11:54:44.878833 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"514b7655-4cf7-42aa-b634-a43a20bb4300","Type":"ContainerStarted","Data":"3114af267f2ffe50c1842ca32aaec2e5dbe18a81acd98af79aad89365f1a7e49"} Feb 18 11:54:44 crc kubenswrapper[4880]: I0218 
11:54:44.881806 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrsr" event={"ID":"e333b514-7367-498c-9660-500e02cfb188","Type":"ContainerStarted","Data":"10cea74274f2ef4c50e3156484da5a945dd7813e9e9d19605c5920c5ff2b887b"} Feb 18 11:54:44 crc kubenswrapper[4880]: I0218 11:54:44.908284 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5zrsr" podStartSLOduration=4.002495276 podStartE2EDuration="42.908255862s" podCreationTimestamp="2026-02-18 11:54:02 +0000 UTC" firstStartedPulling="2026-02-18 11:54:04.31856446 +0000 UTC m=+151.747465321" lastFinishedPulling="2026-02-18 11:54:43.224325046 +0000 UTC m=+190.653225907" observedRunningTime="2026-02-18 11:54:44.900845917 +0000 UTC m=+192.329746788" watchObservedRunningTime="2026-02-18 11:54:44.908255862 +0000 UTC m=+192.337156723" Feb 18 11:54:45 crc kubenswrapper[4880]: I0218 11:54:45.913199 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=3.91317009 podStartE2EDuration="3.91317009s" podCreationTimestamp="2026-02-18 11:54:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:45.910370833 +0000 UTC m=+193.339271694" watchObservedRunningTime="2026-02-18 11:54:45.91317009 +0000 UTC m=+193.342070951" Feb 18 11:54:46 crc kubenswrapper[4880]: I0218 11:54:46.902144 4880 generic.go:334] "Generic (PLEG): container finished" podID="514b7655-4cf7-42aa-b634-a43a20bb4300" containerID="3114af267f2ffe50c1842ca32aaec2e5dbe18a81acd98af79aad89365f1a7e49" exitCode=0 Feb 18 11:54:46 crc kubenswrapper[4880]: I0218 11:54:46.902229 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"514b7655-4cf7-42aa-b634-a43a20bb4300","Type":"ContainerDied","Data":"3114af267f2ffe50c1842ca32aaec2e5dbe18a81acd98af79aad89365f1a7e49"} Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.227084 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.228409 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.239837 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.279593 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.279692 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kube-api-access\") pod \"installer-9-crc\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.279772 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-var-lock\") pod \"installer-9-crc\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.384374 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-var-lock\") pod \"installer-9-crc\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.384469 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.384487 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kube-api-access\") pod \"installer-9-crc\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.384622 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-var-lock\") pod \"installer-9-crc\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.384749 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.426692 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:47 crc kubenswrapper[4880]: I0218 11:54:47.564490 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.300351 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.399342 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/514b7655-4cf7-42aa-b634-a43a20bb4300-kube-api-access\") pod \"514b7655-4cf7-42aa-b634-a43a20bb4300\" (UID: \"514b7655-4cf7-42aa-b634-a43a20bb4300\") " Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.399469 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/514b7655-4cf7-42aa-b634-a43a20bb4300-kubelet-dir\") pod \"514b7655-4cf7-42aa-b634-a43a20bb4300\" (UID: \"514b7655-4cf7-42aa-b634-a43a20bb4300\") " Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.399589 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/514b7655-4cf7-42aa-b634-a43a20bb4300-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "514b7655-4cf7-42aa-b634-a43a20bb4300" (UID: "514b7655-4cf7-42aa-b634-a43a20bb4300"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.399922 4880 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/514b7655-4cf7-42aa-b634-a43a20bb4300-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.405993 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514b7655-4cf7-42aa-b634-a43a20bb4300-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "514b7655-4cf7-42aa-b634-a43a20bb4300" (UID: "514b7655-4cf7-42aa-b634-a43a20bb4300"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.501664 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/514b7655-4cf7-42aa-b634-a43a20bb4300-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.929567 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"514b7655-4cf7-42aa-b634-a43a20bb4300","Type":"ContainerDied","Data":"70b4ba04c9879d8dac55e78213cb1cf5c0a8f4451c4d37aa62d8a005cfdfdf56"} Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.929639 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70b4ba04c9879d8dac55e78213cb1cf5c0a8f4451c4d37aa62d8a005cfdfdf56" Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.929704 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.933007 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpv8s" event={"ID":"1dd2f4dc-890d-485f-a29a-caba7ea13cde","Type":"ContainerStarted","Data":"bbb811994d0f61acdb720082eabdc019a1b24ba1229e8f3e48fee60cbe821fa0"} Feb 18 11:54:48 crc kubenswrapper[4880]: I0218 11:54:48.956391 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xpv8s" podStartSLOduration=5.786568029 podStartE2EDuration="43.956366641s" podCreationTimestamp="2026-02-18 11:54:05 +0000 UTC" firstStartedPulling="2026-02-18 11:54:08.497913515 +0000 UTC m=+155.926814376" lastFinishedPulling="2026-02-18 11:54:46.667712127 +0000 UTC m=+194.096612988" observedRunningTime="2026-02-18 11:54:48.953802688 +0000 UTC m=+196.382703559" watchObservedRunningTime="2026-02-18 11:54:48.956366641 +0000 UTC m=+196.385267502" Feb 18 11:54:50 crc kubenswrapper[4880]: I0218 11:54:50.548887 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 11:54:50 crc kubenswrapper[4880]: W0218 11:54:50.557734 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0b2631ac_d27b_47f0_928a_aff9b9f00d30.slice/crio-f6611ce2603cb6c0e37d1f592dfb96f79c1a9fe655c5985628173bc9e8ec4e04 WatchSource:0}: Error finding container f6611ce2603cb6c0e37d1f592dfb96f79c1a9fe655c5985628173bc9e8ec4e04: Status 404 returned error can't find the container with id f6611ce2603cb6c0e37d1f592dfb96f79c1a9fe655c5985628173bc9e8ec4e04 Feb 18 11:54:50 crc kubenswrapper[4880]: I0218 11:54:50.955044 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"0b2631ac-d27b-47f0-928a-aff9b9f00d30","Type":"ContainerStarted","Data":"f6611ce2603cb6c0e37d1f592dfb96f79c1a9fe655c5985628173bc9e8ec4e04"} Feb 18 11:54:51 crc kubenswrapper[4880]: I0218 11:54:51.962744 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0b2631ac-d27b-47f0-928a-aff9b9f00d30","Type":"ContainerStarted","Data":"f63ed125fb14a064b53b63567a6bc066a74647289f2f820907c9f1ced5adc14a"} Feb 18 11:54:51 crc kubenswrapper[4880]: I0218 11:54:51.965049 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rr7dq" event={"ID":"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7","Type":"ContainerStarted","Data":"0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6"} Feb 18 11:54:51 crc kubenswrapper[4880]: I0218 11:54:51.985461 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rr7dq" podStartSLOduration=5.319070415 podStartE2EDuration="47.985439322s" podCreationTimestamp="2026-02-18 11:54:04 +0000 UTC" firstStartedPulling="2026-02-18 11:54:07.462038952 +0000 UTC m=+154.890939813" lastFinishedPulling="2026-02-18 11:54:50.128407859 +0000 UTC m=+197.557308720" observedRunningTime="2026-02-18 11:54:51.982090557 +0000 UTC m=+199.410991448" watchObservedRunningTime="2026-02-18 11:54:51.985439322 +0000 UTC m=+199.414340183" Feb 18 11:54:52 crc kubenswrapper[4880]: I0218 11:54:52.912754 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:52 crc kubenswrapper[4880]: I0218 11:54:52.912892 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:53 crc kubenswrapper[4880]: I0218 11:54:53.001865 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" 
podStartSLOduration=6.001834613 podStartE2EDuration="6.001834613s" podCreationTimestamp="2026-02-18 11:54:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:54:52.996911452 +0000 UTC m=+200.425812333" watchObservedRunningTime="2026-02-18 11:54:53.001834613 +0000 UTC m=+200.430735474" Feb 18 11:54:53 crc kubenswrapper[4880]: I0218 11:54:53.274296 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:54:53 crc kubenswrapper[4880]: I0218 11:54:53.274680 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 11:54:53 crc kubenswrapper[4880]: I0218 11:54:53.326994 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-6n78q" Feb 18 11:54:53 crc kubenswrapper[4880]: I0218 11:54:53.541651 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:53 crc kubenswrapper[4880]: I0218 11:54:53.593697 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:54:55 crc kubenswrapper[4880]: I0218 11:54:55.081263 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:55 crc kubenswrapper[4880]: I0218 11:54:55.081427 4880 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:55 crc kubenswrapper[4880]: I0218 11:54:55.120434 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:55 crc kubenswrapper[4880]: I0218 11:54:55.504303 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:55 crc kubenswrapper[4880]: I0218 11:54:55.504368 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:55 crc kubenswrapper[4880]: I0218 11:54:55.546795 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:56 crc kubenswrapper[4880]: I0218 11:54:56.022368 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:54:56 crc kubenswrapper[4880]: I0218 11:54:56.771201 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpv8s"] Feb 18 11:54:57 crc kubenswrapper[4880]: I0218 11:54:57.033389 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rr7dq" Feb 18 11:54:57 crc kubenswrapper[4880]: I0218 11:54:57.995826 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xpv8s" podUID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" containerName="registry-server" containerID="cri-o://bbb811994d0f61acdb720082eabdc019a1b24ba1229e8f3e48fee60cbe821fa0" gracePeriod=2 Feb 18 11:54:59 crc kubenswrapper[4880]: I0218 11:54:59.012350 4880 generic.go:334] "Generic (PLEG): container finished" podID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" 
containerID="bbb811994d0f61acdb720082eabdc019a1b24ba1229e8f3e48fee60cbe821fa0" exitCode=0 Feb 18 11:54:59 crc kubenswrapper[4880]: I0218 11:54:59.012467 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpv8s" event={"ID":"1dd2f4dc-890d-485f-a29a-caba7ea13cde","Type":"ContainerDied","Data":"bbb811994d0f61acdb720082eabdc019a1b24ba1229e8f3e48fee60cbe821fa0"} Feb 18 11:55:00 crc kubenswrapper[4880]: I0218 11:55:00.829750 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.014080 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-utilities\") pod \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.014209 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp4mz\" (UniqueName: \"kubernetes.io/projected/1dd2f4dc-890d-485f-a29a-caba7ea13cde-kube-api-access-xp4mz\") pod \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.014311 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-catalog-content\") pod \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\" (UID: \"1dd2f4dc-890d-485f-a29a-caba7ea13cde\") " Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.015663 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-utilities" (OuterVolumeSpecName: "utilities") pod "1dd2f4dc-890d-485f-a29a-caba7ea13cde" (UID: 
"1dd2f4dc-890d-485f-a29a-caba7ea13cde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.023464 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd2f4dc-890d-485f-a29a-caba7ea13cde-kube-api-access-xp4mz" (OuterVolumeSpecName: "kube-api-access-xp4mz") pod "1dd2f4dc-890d-485f-a29a-caba7ea13cde" (UID: "1dd2f4dc-890d-485f-a29a-caba7ea13cde"). InnerVolumeSpecName "kube-api-access-xp4mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.026872 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpv8s" event={"ID":"1dd2f4dc-890d-485f-a29a-caba7ea13cde","Type":"ContainerDied","Data":"0717aa4cf40db7c2c8211d31ad17004827d55ba7ee982c02b9cbb32511a9f1e4"} Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.026959 4880 scope.go:117] "RemoveContainer" containerID="bbb811994d0f61acdb720082eabdc019a1b24ba1229e8f3e48fee60cbe821fa0" Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.027149 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpv8s" Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.057982 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dd2f4dc-890d-485f-a29a-caba7ea13cde" (UID: "1dd2f4dc-890d-485f-a29a-caba7ea13cde"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.115614 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp4mz\" (UniqueName: \"kubernetes.io/projected/1dd2f4dc-890d-485f-a29a-caba7ea13cde-kube-api-access-xp4mz\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.116038 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.116063 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd2f4dc-890d-485f-a29a-caba7ea13cde-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.349859 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpv8s"] Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.359303 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpv8s"] Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.680902 4880 scope.go:117] "RemoveContainer" containerID="9b62fec9d478e81fda8b5efcdc1488eb972c84979888f668693866238f17f1f3" Feb 18 11:55:01 crc kubenswrapper[4880]: I0218 11:55:01.826469 4880 scope.go:117] "RemoveContainer" containerID="61e46be63aec36f4ffdfa316e499442aa885dfd9a07fda914f6a5336aee06352" Feb 18 11:55:03 crc kubenswrapper[4880]: I0218 11:55:03.045084 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbfgr" event={"ID":"95755e2d-9c90-48ce-8141-7c209fd6d193","Type":"ContainerStarted","Data":"59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703"} Feb 18 11:55:03 crc kubenswrapper[4880]: I0218 11:55:03.046970 4880 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-bxlz8" event={"ID":"1f881593-be88-4dca-a7df-6b287588efbb","Type":"ContainerStarted","Data":"01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46"} Feb 18 11:55:03 crc kubenswrapper[4880]: I0218 11:55:03.049921 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqgh2" event={"ID":"4166cca9-9e48-45b1-9751-e458a37c2e09","Type":"ContainerStarted","Data":"82489c8d8b8819f2d19977e81c429b5cdaf0304a00bed9592a5346c900dd8239"} Feb 18 11:55:03 crc kubenswrapper[4880]: I0218 11:55:03.053270 4880 generic.go:334] "Generic (PLEG): container finished" podID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerID="e50e2e35c15ec108485d909892bd52a7f0b252a0134319da39beadfc23e2e38e" exitCode=0 Feb 18 11:55:03 crc kubenswrapper[4880]: I0218 11:55:03.053350 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnd4" event={"ID":"40e949a6-d734-42e6-9423-8597f6d4c9de","Type":"ContainerDied","Data":"e50e2e35c15ec108485d909892bd52a7f0b252a0134319da39beadfc23e2e38e"} Feb 18 11:55:03 crc kubenswrapper[4880]: I0218 11:55:03.055486 4880 generic.go:334] "Generic (PLEG): container finished" podID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerID="7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4" exitCode=0 Feb 18 11:55:03 crc kubenswrapper[4880]: I0218 11:55:03.055514 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dl89" event={"ID":"0a68f0fc-e36e-4682-a3d4-3885ec4b473f","Type":"ContainerDied","Data":"7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4"} Feb 18 11:55:03 crc kubenswrapper[4880]: I0218 11:55:03.069182 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cbfgr" podStartSLOduration=4.962779757 podStartE2EDuration="58.069146144s" podCreationTimestamp="2026-02-18 11:54:05 +0000 UTC" 
firstStartedPulling="2026-02-18 11:54:08.496864627 +0000 UTC m=+155.925765488" lastFinishedPulling="2026-02-18 11:55:01.603231014 +0000 UTC m=+209.032131875" observedRunningTime="2026-02-18 11:55:03.068030713 +0000 UTC m=+210.496931584" watchObservedRunningTime="2026-02-18 11:55:03.069146144 +0000 UTC m=+210.498047005" Feb 18 11:55:03 crc kubenswrapper[4880]: I0218 11:55:03.188599 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" path="/var/lib/kubelet/pods/1dd2f4dc-890d-485f-a29a-caba7ea13cde/volumes" Feb 18 11:55:04 crc kubenswrapper[4880]: I0218 11:55:04.068346 4880 generic.go:334] "Generic (PLEG): container finished" podID="1f881593-be88-4dca-a7df-6b287588efbb" containerID="01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46" exitCode=0 Feb 18 11:55:04 crc kubenswrapper[4880]: I0218 11:55:04.068440 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxlz8" event={"ID":"1f881593-be88-4dca-a7df-6b287588efbb","Type":"ContainerDied","Data":"01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46"} Feb 18 11:55:04 crc kubenswrapper[4880]: I0218 11:55:04.095317 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bqgh2" podStartSLOduration=4.790953523 podStartE2EDuration="58.095285622s" podCreationTimestamp="2026-02-18 11:54:06 +0000 UTC" firstStartedPulling="2026-02-18 11:54:08.522139251 +0000 UTC m=+155.951040112" lastFinishedPulling="2026-02-18 11:55:01.82647135 +0000 UTC m=+209.255372211" observedRunningTime="2026-02-18 11:55:03.152221842 +0000 UTC m=+210.581122713" watchObservedRunningTime="2026-02-18 11:55:04.095285622 +0000 UTC m=+211.524186483" Feb 18 11:55:06 crc kubenswrapper[4880]: I0218 11:55:06.091933 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:55:06 crc 
kubenswrapper[4880]: I0218 11:55:06.093477 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:55:06 crc kubenswrapper[4880]: I0218 11:55:06.506801 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:55:06 crc kubenswrapper[4880]: I0218 11:55:06.506867 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:55:07 crc kubenswrapper[4880]: I0218 11:55:07.141917 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cbfgr" podUID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerName="registry-server" probeResult="failure" output=< Feb 18 11:55:07 crc kubenswrapper[4880]: timeout: failed to connect service ":50051" within 1s Feb 18 11:55:07 crc kubenswrapper[4880]: > Feb 18 11:55:07 crc kubenswrapper[4880]: I0218 11:55:07.407516 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" podUID="5d4532b3-caee-400d-992c-023a97f6d0ca" containerName="oauth-openshift" containerID="cri-o://d1a6f2b5da434105602e8bffd1703b53607cf8e08b155728fa79f3a52f135ab3" gracePeriod=15 Feb 18 11:55:07 crc kubenswrapper[4880]: I0218 11:55:07.547486 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bqgh2" podUID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerName="registry-server" probeResult="failure" output=< Feb 18 11:55:07 crc kubenswrapper[4880]: timeout: failed to connect service ":50051" within 1s Feb 18 11:55:07 crc kubenswrapper[4880]: > Feb 18 11:55:09 crc kubenswrapper[4880]: I0218 11:55:09.100067 4880 generic.go:334] "Generic (PLEG): container finished" podID="5d4532b3-caee-400d-992c-023a97f6d0ca" containerID="d1a6f2b5da434105602e8bffd1703b53607cf8e08b155728fa79f3a52f135ab3" 
exitCode=0 Feb 18 11:55:09 crc kubenswrapper[4880]: I0218 11:55:09.100141 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" event={"ID":"5d4532b3-caee-400d-992c-023a97f6d0ca","Type":"ContainerDied","Data":"d1a6f2b5da434105602e8bffd1703b53607cf8e08b155728fa79f3a52f135ab3"} Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.266531 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.317396 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm"] Feb 18 11:55:10 crc kubenswrapper[4880]: E0218 11:55:10.317823 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" containerName="extract-content" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.317845 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" containerName="extract-content" Feb 18 11:55:10 crc kubenswrapper[4880]: E0218 11:55:10.317863 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514b7655-4cf7-42aa-b634-a43a20bb4300" containerName="pruner" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.317873 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="514b7655-4cf7-42aa-b634-a43a20bb4300" containerName="pruner" Feb 18 11:55:10 crc kubenswrapper[4880]: E0218 11:55:10.317893 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4532b3-caee-400d-992c-023a97f6d0ca" containerName="oauth-openshift" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.317901 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4532b3-caee-400d-992c-023a97f6d0ca" containerName="oauth-openshift" Feb 18 11:55:10 crc kubenswrapper[4880]: E0218 11:55:10.317913 4880 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" containerName="extract-utilities" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.317921 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" containerName="extract-utilities" Feb 18 11:55:10 crc kubenswrapper[4880]: E0218 11:55:10.317934 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" containerName="registry-server" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.317942 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" containerName="registry-server" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.318083 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd2f4dc-890d-485f-a29a-caba7ea13cde" containerName="registry-server" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.318104 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="514b7655-4cf7-42aa-b634-a43a20bb4300" containerName="pruner" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.318116 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4532b3-caee-400d-992c-023a97f6d0ca" containerName="oauth-openshift" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.318782 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.324397 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm"] Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.376779 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-provider-selection\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.376862 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-ocp-branding-template\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.376893 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-idp-0-file-data\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.376930 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-policies\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.376949 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-cliconfig\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.377016 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-login\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.377039 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-service-ca\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.377065 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-serving-cert\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.377090 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvtgq\" (UniqueName: \"kubernetes.io/projected/5d4532b3-caee-400d-992c-023a97f6d0ca-kube-api-access-jvtgq\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.377136 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-error\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.377155 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-dir\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.377180 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-session\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.377218 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-trusted-ca-bundle\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.377262 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-router-certs\") pod \"5d4532b3-caee-400d-992c-023a97f6d0ca\" (UID: \"5d4532b3-caee-400d-992c-023a97f6d0ca\") " Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.377935 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: 
"5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.378506 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.378933 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379015 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379035 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379445 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-session\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379476 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379501 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gnnh\" (UniqueName: \"kubernetes.io/projected/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-kube-api-access-6gnnh\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379543 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379566 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379598 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379661 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-audit-policies\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379692 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379732 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379759 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-audit-dir\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379794 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379820 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379844 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") 
" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379864 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379907 4880 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379919 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379929 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379940 4880 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d4532b3-caee-400d-992c-023a97f6d0ca-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.379955 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc 
kubenswrapper[4880]: I0218 11:55:10.387744 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.388128 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.388675 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.388778 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4532b3-caee-400d-992c-023a97f6d0ca-kube-api-access-jvtgq" (OuterVolumeSpecName: "kube-api-access-jvtgq") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "kube-api-access-jvtgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.389277 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.392015 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.393879 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.394210 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.394404 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "5d4532b3-caee-400d-992c-023a97f6d0ca" (UID: "5d4532b3-caee-400d-992c-023a97f6d0ca"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.480969 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481034 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-audit-dir\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481066 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481094 4880 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481121 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481148 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481188 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-session\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481209 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481231 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gnnh\" (UniqueName: \"kubernetes.io/projected/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-kube-api-access-6gnnh\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481248 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-audit-dir\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481267 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481301 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481334 4880 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481359 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-audit-policies\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481376 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481427 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481438 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481450 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481460 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481472 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481484 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481494 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvtgq\" (UniqueName: \"kubernetes.io/projected/5d4532b3-caee-400d-992c-023a97f6d0ca-kube-api-access-jvtgq\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481504 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.481514 4880 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5d4532b3-caee-400d-992c-023a97f6d0ca-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.482674 4880 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-audit-policies\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.482979 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.483697 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.484381 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.486431 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: 
\"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.486451 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.486518 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.486953 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.487476 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-system-session\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.489793 4880 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.489804 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.494065 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.501070 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gnnh\" (UniqueName: \"kubernetes.io/projected/4c46684f-5a8e-4243-9ba7-b2ec05a65db3-kube-api-access-6gnnh\") pod \"oauth-openshift-6c8d5d4f46-4htlm\" (UID: \"4c46684f-5a8e-4243-9ba7-b2ec05a65db3\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:10 crc kubenswrapper[4880]: I0218 11:55:10.636699 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:11 crc kubenswrapper[4880]: I0218 11:55:11.118540 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm"] Feb 18 11:55:11 crc kubenswrapper[4880]: I0218 11:55:11.120328 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" event={"ID":"5d4532b3-caee-400d-992c-023a97f6d0ca","Type":"ContainerDied","Data":"7e6704454cec8fadb22b128e07d5d9b74d2e54888a8274a3ae30f1fae24dd02b"} Feb 18 11:55:11 crc kubenswrapper[4880]: I0218 11:55:11.120383 4880 scope.go:117] "RemoveContainer" containerID="d1a6f2b5da434105602e8bffd1703b53607cf8e08b155728fa79f3a52f135ab3" Feb 18 11:55:11 crc kubenswrapper[4880]: I0218 11:55:11.120407 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9tqb6" Feb 18 11:55:11 crc kubenswrapper[4880]: I0218 11:55:11.160350 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9tqb6"] Feb 18 11:55:11 crc kubenswrapper[4880]: I0218 11:55:11.167599 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9tqb6"] Feb 18 11:55:11 crc kubenswrapper[4880]: I0218 11:55:11.188386 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4532b3-caee-400d-992c-023a97f6d0ca" path="/var/lib/kubelet/pods/5d4532b3-caee-400d-992c-023a97f6d0ca/volumes" Feb 18 11:55:12 crc kubenswrapper[4880]: I0218 11:55:12.136708 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnd4" event={"ID":"40e949a6-d734-42e6-9423-8597f6d4c9de","Type":"ContainerStarted","Data":"cbbf1a727305bab64e5c88bcbd1120444b4a5013cc7df5fff5e6257f64763856"} Feb 18 11:55:12 crc kubenswrapper[4880]: I0218 11:55:12.147487 4880 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" event={"ID":"4c46684f-5a8e-4243-9ba7-b2ec05a65db3","Type":"ContainerStarted","Data":"819ac2b435ddc8affee19288a2ddb2e89e8dc1e685becfd2361d0cf35019515b"} Feb 18 11:55:12 crc kubenswrapper[4880]: I0218 11:55:12.147560 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" event={"ID":"4c46684f-5a8e-4243-9ba7-b2ec05a65db3","Type":"ContainerStarted","Data":"77e1da16af86b39e7e48cedc033d3cc6106b43b00f9c55c319e5ce8fd1686941"} Feb 18 11:55:12 crc kubenswrapper[4880]: I0218 11:55:12.157318 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bsnd4" podStartSLOduration=3.852706316 podStartE2EDuration="1m10.157293592s" podCreationTimestamp="2026-02-18 11:54:02 +0000 UTC" firstStartedPulling="2026-02-18 11:54:04.328782529 +0000 UTC m=+151.757683380" lastFinishedPulling="2026-02-18 11:55:10.633369785 +0000 UTC m=+218.062270656" observedRunningTime="2026-02-18 11:55:12.15616408 +0000 UTC m=+219.585064951" watchObservedRunningTime="2026-02-18 11:55:12.157293592 +0000 UTC m=+219.586194473" Feb 18 11:55:13 crc kubenswrapper[4880]: I0218 11:55:13.122576 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bsnd4" Feb 18 11:55:13 crc kubenswrapper[4880]: I0218 11:55:13.122658 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bsnd4" Feb 18 11:55:13 crc kubenswrapper[4880]: I0218 11:55:13.155585 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:13 crc kubenswrapper[4880]: I0218 11:55:13.163176 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" Feb 18 11:55:13 crc kubenswrapper[4880]: I0218 11:55:13.169583 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bsnd4" Feb 18 11:55:13 crc kubenswrapper[4880]: I0218 11:55:13.184339 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-4htlm" podStartSLOduration=31.184311324 podStartE2EDuration="31.184311324s" podCreationTimestamp="2026-02-18 11:54:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:55:13.183707497 +0000 UTC m=+220.612608368" watchObservedRunningTime="2026-02-18 11:55:13.184311324 +0000 UTC m=+220.613212185" Feb 18 11:55:15 crc kubenswrapper[4880]: I0218 11:55:15.174643 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dl89" event={"ID":"0a68f0fc-e36e-4682-a3d4-3885ec4b473f","Type":"ContainerStarted","Data":"392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414"} Feb 18 11:55:16 crc kubenswrapper[4880]: I0218 11:55:16.133461 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:55:16 crc kubenswrapper[4880]: I0218 11:55:16.177579 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cbfgr" Feb 18 11:55:16 crc kubenswrapper[4880]: I0218 11:55:16.224376 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6dl89" podStartSLOduration=3.99185145 podStartE2EDuration="1m13.224352737s" podCreationTimestamp="2026-02-18 11:54:03 +0000 UTC" firstStartedPulling="2026-02-18 11:54:04.336497067 +0000 UTC m=+151.765397928" lastFinishedPulling="2026-02-18 11:55:13.568998344 +0000 UTC 
m=+220.997899215" observedRunningTime="2026-02-18 11:55:16.221017522 +0000 UTC m=+223.649918393" watchObservedRunningTime="2026-02-18 11:55:16.224352737 +0000 UTC m=+223.653253598" Feb 18 11:55:16 crc kubenswrapper[4880]: I0218 11:55:16.546183 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:55:16 crc kubenswrapper[4880]: I0218 11:55:16.595005 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:55:18 crc kubenswrapper[4880]: I0218 11:55:18.658489 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bqgh2"] Feb 18 11:55:18 crc kubenswrapper[4880]: I0218 11:55:18.659435 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bqgh2" podUID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerName="registry-server" containerID="cri-o://82489c8d8b8819f2d19977e81c429b5cdaf0304a00bed9592a5346c900dd8239" gracePeriod=2 Feb 18 11:55:19 crc kubenswrapper[4880]: I0218 11:55:19.198027 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxlz8" event={"ID":"1f881593-be88-4dca-a7df-6b287588efbb","Type":"ContainerStarted","Data":"a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2"} Feb 18 11:55:20 crc kubenswrapper[4880]: I0218 11:55:20.206557 4880 generic.go:334] "Generic (PLEG): container finished" podID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerID="82489c8d8b8819f2d19977e81c429b5cdaf0304a00bed9592a5346c900dd8239" exitCode=0 Feb 18 11:55:20 crc kubenswrapper[4880]: I0218 11:55:20.206644 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqgh2" event={"ID":"4166cca9-9e48-45b1-9751-e458a37c2e09","Type":"ContainerDied","Data":"82489c8d8b8819f2d19977e81c429b5cdaf0304a00bed9592a5346c900dd8239"} Feb 18 
11:55:20 crc kubenswrapper[4880]: I0218 11:55:20.957743 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.134368 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-utilities\") pod \"4166cca9-9e48-45b1-9751-e458a37c2e09\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.134535 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmhzx\" (UniqueName: \"kubernetes.io/projected/4166cca9-9e48-45b1-9751-e458a37c2e09-kube-api-access-lmhzx\") pod \"4166cca9-9e48-45b1-9751-e458a37c2e09\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.134572 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-catalog-content\") pod \"4166cca9-9e48-45b1-9751-e458a37c2e09\" (UID: \"4166cca9-9e48-45b1-9751-e458a37c2e09\") " Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.135413 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-utilities" (OuterVolumeSpecName: "utilities") pod "4166cca9-9e48-45b1-9751-e458a37c2e09" (UID: "4166cca9-9e48-45b1-9751-e458a37c2e09"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.141656 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4166cca9-9e48-45b1-9751-e458a37c2e09-kube-api-access-lmhzx" (OuterVolumeSpecName: "kube-api-access-lmhzx") pod "4166cca9-9e48-45b1-9751-e458a37c2e09" (UID: "4166cca9-9e48-45b1-9751-e458a37c2e09"). InnerVolumeSpecName "kube-api-access-lmhzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.219230 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqgh2" event={"ID":"4166cca9-9e48-45b1-9751-e458a37c2e09","Type":"ContainerDied","Data":"b99b2dd5da4b3bf2faca1400316389a4a49a3fd37431b2107446e326b804ab15"} Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.219320 4880 scope.go:117] "RemoveContainer" containerID="82489c8d8b8819f2d19977e81c429b5cdaf0304a00bed9592a5346c900dd8239" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.219325 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bqgh2" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.236536 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmhzx\" (UniqueName: \"kubernetes.io/projected/4166cca9-9e48-45b1-9751-e458a37c2e09-kube-api-access-lmhzx\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.237036 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.240123 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bxlz8" podStartSLOduration=5.61674613 podStartE2EDuration="1m19.24009873s" podCreationTimestamp="2026-02-18 11:54:02 +0000 UTC" firstStartedPulling="2026-02-18 11:54:04.314216917 +0000 UTC m=+151.743117778" lastFinishedPulling="2026-02-18 11:55:17.937569517 +0000 UTC m=+225.366470378" observedRunningTime="2026-02-18 11:55:21.238255597 +0000 UTC m=+228.667156478" watchObservedRunningTime="2026-02-18 11:55:21.24009873 +0000 UTC m=+228.668999591" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.247484 4880 scope.go:117] "RemoveContainer" containerID="acbaa270debca5e7418153d68ff71bd6e93651a628151c091d932e8ff92aade1" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.260378 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4166cca9-9e48-45b1-9751-e458a37c2e09" (UID: "4166cca9-9e48-45b1-9751-e458a37c2e09"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.270021 4880 scope.go:117] "RemoveContainer" containerID="c9ac54567a4aea59c2652cb48a4333d3778c982dd9149ce6f1c6cadf03bf0f42" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.340002 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4166cca9-9e48-45b1-9751-e458a37c2e09-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.546522 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bqgh2"] Feb 18 11:55:21 crc kubenswrapper[4880]: I0218 11:55:21.550217 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bqgh2"] Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.177082 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bsnd4" Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.212004 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4166cca9-9e48-45b1-9751-e458a37c2e09" path="/var/lib/kubelet/pods/4166cca9-9e48-45b1-9751-e458a37c2e09/volumes" Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.274820 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.275096 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.275170 4880 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.276065 4880 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82"} pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.276130 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" containerID="cri-o://ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82" gracePeriod=600 Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.320260 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bxlz8" Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.320338 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bxlz8" Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.367597 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bxlz8" Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.516185 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6dl89" Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.516522 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-6dl89" Feb 18 11:55:23 crc kubenswrapper[4880]: I0218 11:55:23.556136 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6dl89" Feb 18 11:55:24 crc kubenswrapper[4880]: I0218 11:55:24.317052 4880 generic.go:334] "Generic (PLEG): container finished" podID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerID="ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82" exitCode=0 Feb 18 11:55:24 crc kubenswrapper[4880]: I0218 11:55:24.317105 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerDied","Data":"ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82"} Feb 18 11:55:24 crc kubenswrapper[4880]: I0218 11:55:24.356506 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bxlz8" Feb 18 11:55:24 crc kubenswrapper[4880]: I0218 11:55:24.360308 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6dl89" Feb 18 11:55:25 crc kubenswrapper[4880]: I0218 11:55:25.062687 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxlz8"] Feb 18 11:55:25 crc kubenswrapper[4880]: I0218 11:55:25.326970 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerStarted","Data":"60fb6b1e2c099ffdb6e6dd1164357f5a1d76a11dcc81d2d4ad9a7cfb6689e88e"} Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.335302 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bxlz8" podUID="1f881593-be88-4dca-a7df-6b287588efbb" containerName="registry-server" 
containerID="cri-o://a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2" gracePeriod=2 Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.465169 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6dl89"] Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.747532 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxlz8" Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.827818 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9b7d\" (UniqueName: \"kubernetes.io/projected/1f881593-be88-4dca-a7df-6b287588efbb-kube-api-access-j9b7d\") pod \"1f881593-be88-4dca-a7df-6b287588efbb\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.827933 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-utilities\") pod \"1f881593-be88-4dca-a7df-6b287588efbb\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.827994 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-catalog-content\") pod \"1f881593-be88-4dca-a7df-6b287588efbb\" (UID: \"1f881593-be88-4dca-a7df-6b287588efbb\") " Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.829167 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-utilities" (OuterVolumeSpecName: "utilities") pod "1f881593-be88-4dca-a7df-6b287588efbb" (UID: "1f881593-be88-4dca-a7df-6b287588efbb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.848119 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f881593-be88-4dca-a7df-6b287588efbb-kube-api-access-j9b7d" (OuterVolumeSpecName: "kube-api-access-j9b7d") pod "1f881593-be88-4dca-a7df-6b287588efbb" (UID: "1f881593-be88-4dca-a7df-6b287588efbb"). InnerVolumeSpecName "kube-api-access-j9b7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.888263 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f881593-be88-4dca-a7df-6b287588efbb" (UID: "1f881593-be88-4dca-a7df-6b287588efbb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.929918 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.929987 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f881593-be88-4dca-a7df-6b287588efbb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:26 crc kubenswrapper[4880]: I0218 11:55:26.930006 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9b7d\" (UniqueName: \"kubernetes.io/projected/1f881593-be88-4dca-a7df-6b287588efbb-kube-api-access-j9b7d\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.343434 4880 generic.go:334] "Generic (PLEG): container finished" podID="1f881593-be88-4dca-a7df-6b287588efbb" 
containerID="a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2" exitCode=0 Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.343483 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxlz8" event={"ID":"1f881593-be88-4dca-a7df-6b287588efbb","Type":"ContainerDied","Data":"a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2"} Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.343546 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxlz8" event={"ID":"1f881593-be88-4dca-a7df-6b287588efbb","Type":"ContainerDied","Data":"756399d8096a590a0519ceafa8bb47545621deb9b23dff050e830109105ca2b5"} Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.343569 4880 scope.go:117] "RemoveContainer" containerID="a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2" Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.343581 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bxlz8" Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.343767 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6dl89" podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerName="registry-server" containerID="cri-o://392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414" gracePeriod=2 Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.374064 4880 scope.go:117] "RemoveContainer" containerID="01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46" Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.379214 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxlz8"] Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.383519 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bxlz8"] Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.398382 4880 scope.go:117] "RemoveContainer" containerID="5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580" Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.424066 4880 scope.go:117] "RemoveContainer" containerID="a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2" Feb 18 11:55:27 crc kubenswrapper[4880]: E0218 11:55:27.424743 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2\": container with ID starting with a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2 not found: ID does not exist" containerID="a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2" Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.424781 4880 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2"} err="failed to get container status \"a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2\": rpc error: code = NotFound desc = could not find container \"a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2\": container with ID starting with a9bd4af4fc335843a4a5aedde4e300397d8a91364a624bcd1a49198f6ae690d2 not found: ID does not exist" Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.424811 4880 scope.go:117] "RemoveContainer" containerID="01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46" Feb 18 11:55:27 crc kubenswrapper[4880]: E0218 11:55:27.425176 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46\": container with ID starting with 01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46 not found: ID does not exist" containerID="01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46" Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.425294 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46"} err="failed to get container status \"01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46\": rpc error: code = NotFound desc = could not find container \"01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46\": container with ID starting with 01ed8724d91baa1ddef060583687bb3fc839343503d086543b47191df9713b46 not found: ID does not exist" Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.425498 4880 scope.go:117] "RemoveContainer" containerID="5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580" Feb 18 11:55:27 crc kubenswrapper[4880]: E0218 11:55:27.426016 4880 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580\": container with ID starting with 5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580 not found: ID does not exist" containerID="5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580" Feb 18 11:55:27 crc kubenswrapper[4880]: I0218 11:55:27.426052 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580"} err="failed to get container status \"5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580\": rpc error: code = NotFound desc = could not find container \"5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580\": container with ID starting with 5319c7cb7d81806e17c4453d346a6f910e5324271f0a74be2ea6a0b31d50a580 not found: ID does not exist" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.265774 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6dl89" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.352801 4880 generic.go:334] "Generic (PLEG): container finished" podID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerID="392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414" exitCode=0 Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.352872 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6dl89" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.352877 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dl89" event={"ID":"0a68f0fc-e36e-4682-a3d4-3885ec4b473f","Type":"ContainerDied","Data":"392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414"} Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.352948 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6dl89" event={"ID":"0a68f0fc-e36e-4682-a3d4-3885ec4b473f","Type":"ContainerDied","Data":"4d4479d7e597ba61b58ce60a6204e8204fd157b12b7b48abb254f3c71197cd5e"} Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.352984 4880 scope.go:117] "RemoveContainer" containerID="392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.354793 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-catalog-content\") pod \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.354838 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-utilities\") pod \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.355798 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nk7z\" (UniqueName: \"kubernetes.io/projected/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-kube-api-access-4nk7z\") pod \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\" (UID: \"0a68f0fc-e36e-4682-a3d4-3885ec4b473f\") " Feb 18 11:55:28 crc 
kubenswrapper[4880]: I0218 11:55:28.355805 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-utilities" (OuterVolumeSpecName: "utilities") pod "0a68f0fc-e36e-4682-a3d4-3885ec4b473f" (UID: "0a68f0fc-e36e-4682-a3d4-3885ec4b473f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.359441 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.363815 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-kube-api-access-4nk7z" (OuterVolumeSpecName: "kube-api-access-4nk7z") pod "0a68f0fc-e36e-4682-a3d4-3885ec4b473f" (UID: "0a68f0fc-e36e-4682-a3d4-3885ec4b473f"). InnerVolumeSpecName "kube-api-access-4nk7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.373712 4880 scope.go:117] "RemoveContainer" containerID="7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.388545 4880 scope.go:117] "RemoveContainer" containerID="0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.402803 4880 scope.go:117] "RemoveContainer" containerID="392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414" Feb 18 11:55:28 crc kubenswrapper[4880]: E0218 11:55:28.403321 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414\": container with ID starting with 392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414 not found: ID does not exist" containerID="392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.403388 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414"} err="failed to get container status \"392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414\": rpc error: code = NotFound desc = could not find container \"392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414\": container with ID starting with 392c553aa63b60369a142230679e6ca06b4c64928940711eadc354fe1cf47414 not found: ID does not exist" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.403424 4880 scope.go:117] "RemoveContainer" containerID="7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4" Feb 18 11:55:28 crc kubenswrapper[4880]: E0218 11:55:28.403858 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4\": container with ID starting with 7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4 not found: ID does not exist" containerID="7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.403903 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4"} err="failed to get container status \"7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4\": rpc error: code = NotFound desc = could not find container \"7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4\": container with ID starting with 7d1f712e4c4ecbfa34844917e09a03bb35db9fb895b469aced30e40b0e2ff2a4 not found: ID does not exist" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.403927 4880 scope.go:117] "RemoveContainer" containerID="0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a" Feb 18 11:55:28 crc kubenswrapper[4880]: E0218 11:55:28.404217 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a\": container with ID starting with 0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a not found: ID does not exist" containerID="0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.404251 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a"} err="failed to get container status \"0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a\": rpc error: code = NotFound desc = could not find container \"0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a\": 
container with ID starting with 0b105e9b44dc1752942addc4bfdcf490c9a2e1e68b2de23cb864915333b7138a not found: ID does not exist" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.412388 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a68f0fc-e36e-4682-a3d4-3885ec4b473f" (UID: "0a68f0fc-e36e-4682-a3d4-3885ec4b473f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.461505 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nk7z\" (UniqueName: \"kubernetes.io/projected/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-kube-api-access-4nk7z\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.461560 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a68f0fc-e36e-4682-a3d4-3885ec4b473f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.694322 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6dl89"] Feb 18 11:55:28 crc kubenswrapper[4880]: I0218 11:55:28.698024 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6dl89"] Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021231 4880 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.021547 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerName="registry-server" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021560 4880 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerName="registry-server" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.021569 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerName="extract-content" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021575 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerName="extract-content" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.021584 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f881593-be88-4dca-a7df-6b287588efbb" containerName="registry-server" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021591 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f881593-be88-4dca-a7df-6b287588efbb" containerName="registry-server" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.021635 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerName="extract-utilities" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021642 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerName="extract-utilities" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.021650 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerName="registry-server" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021657 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerName="registry-server" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.021666 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerName="extract-content" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021672 4880 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerName="extract-content" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.021681 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f881593-be88-4dca-a7df-6b287588efbb" containerName="extract-utilities" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021688 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f881593-be88-4dca-a7df-6b287588efbb" containerName="extract-utilities" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.021696 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f881593-be88-4dca-a7df-6b287588efbb" containerName="extract-content" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021703 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f881593-be88-4dca-a7df-6b287588efbb" containerName="extract-content" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.021714 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerName="extract-utilities" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021719 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerName="extract-utilities" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021836 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f881593-be88-4dca-a7df-6b287588efbb" containerName="registry-server" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021847 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="4166cca9-9e48-45b1-9751-e458a37c2e09" containerName="registry-server" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.021856 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" containerName="registry-server" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.022291 4880 kubelet.go:2431] "SyncLoop REMOVE" 
source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.022626 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9" gracePeriod=15 Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.022741 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.022741 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd" gracePeriod=15 Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.022758 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef" gracePeriod=15 Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.022821 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4" gracePeriod=15 Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.022859 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460" gracePeriod=15 Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.025082 4880 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.025349 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.025368 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.025382 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.025421 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.025434 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.025440 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.026404 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.026427 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 18 11:55:29 crc 
kubenswrapper[4880]: E0218 11:55:29.026443 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.026452 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.026464 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.026472 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.026482 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.026493 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.026688 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.026707 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.026717 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.026728 4880 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.026738 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.026976 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.086111 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.171320 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.171369 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.171399 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.171450 4880 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.171492 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.171563 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.171628 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.171656 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:29 crc 
kubenswrapper[4880]: I0218 11:55:29.187306 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a68f0fc-e36e-4682-a3d4-3885ec4b473f" path="/var/lib/kubelet/pods/0a68f0fc-e36e-4682-a3d4-3885ec4b473f/volumes" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.188027 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f881593-be88-4dca-a7df-6b287588efbb" path="/var/lib/kubelet/pods/1f881593-be88-4dca-a7df-6b287588efbb/volumes" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273474 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273594 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273689 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273718 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273740 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273767 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273771 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273800 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273795 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 
11:55:29.273836 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273879 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.273906 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.274073 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.274187 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.274718 4880 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.274784 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.364994 4880 generic.go:334] "Generic (PLEG): container finished" podID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" containerID="f63ed125fb14a064b53b63567a6bc066a74647289f2f820907c9f1ced5adc14a" exitCode=0 Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.365090 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0b2631ac-d27b-47f0-928a-aff9b9f00d30","Type":"ContainerDied","Data":"f63ed125fb14a064b53b63567a6bc066a74647289f2f820907c9f1ced5adc14a"} Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.366055 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.366347 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.213:6443: connect: connection refused" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.367843 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.368999 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.369794 4880 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd" exitCode=0 Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.369820 4880 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4" exitCode=0 Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.369828 4880 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef" exitCode=0 Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.369837 4880 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460" exitCode=2 Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.369884 4880 scope.go:117] "RemoveContainer" containerID="c0304accf71ebef0c16b05d37bfa6e4428e8dd5eeb488f303a028b94af561b6e" Feb 18 11:55:29 crc kubenswrapper[4880]: I0218 11:55:29.386923 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:55:29 crc kubenswrapper[4880]: E0218 11:55:29.423711 4880 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895553875ce3947 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 11:55:29.423137095 +0000 UTC m=+236.852037956,LastTimestamp:2026-02-18 11:55:29.423137095 +0000 UTC m=+236.852037956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.379118 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.382264 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b657f6f3a79bfe1e09ccb4cf97d1635aa2febec0c8c4049551c4f1537e05c526"} Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.382352 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d5b73c514290f4f880cc8e21ea4b0c51bd71e537f6d3ca4dfa5544b0c7b6992f"} Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.383071 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.383441 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.622971 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.624156 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.624690 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.802071 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kube-api-access\") pod \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.802206 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kubelet-dir\") pod \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.802271 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-var-lock\") pod \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\" (UID: \"0b2631ac-d27b-47f0-928a-aff9b9f00d30\") " Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.802303 4880 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0b2631ac-d27b-47f0-928a-aff9b9f00d30" (UID: "0b2631ac-d27b-47f0-928a-aff9b9f00d30"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.802385 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-var-lock" (OuterVolumeSpecName: "var-lock") pod "0b2631ac-d27b-47f0-928a-aff9b9f00d30" (UID: "0b2631ac-d27b-47f0-928a-aff9b9f00d30"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.802575 4880 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.802592 4880 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b2631ac-d27b-47f0-928a-aff9b9f00d30-var-lock\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.808999 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b2631ac-d27b-47f0-928a-aff9b9f00d30" (UID: "0b2631ac-d27b-47f0-928a-aff9b9f00d30"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:55:30 crc kubenswrapper[4880]: I0218 11:55:30.904535 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b2631ac-d27b-47f0-928a-aff9b9f00d30-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.391239 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.391251 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0b2631ac-d27b-47f0-928a-aff9b9f00d30","Type":"ContainerDied","Data":"f6611ce2603cb6c0e37d1f592dfb96f79c1a9fe655c5985628173bc9e8ec4e04"} Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.392012 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6611ce2603cb6c0e37d1f592dfb96f79c1a9fe655c5985628173bc9e8ec4e04" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.476508 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.476789 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.512447 4880 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.513323 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.514130 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.514674 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.515067 4880 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.617835 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.617926 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.617994 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.617996 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.618074 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.618160 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.618288 4880 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.618301 4880 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:31 crc kubenswrapper[4880]: I0218 11:55:31.618309 4880 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.404451 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.405540 4880 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9" exitCode=0 Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.405834 4880 scope.go:117] "RemoveContainer" containerID="187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.406050 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.431503 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.432203 4880 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.432698 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.437226 4880 scope.go:117] "RemoveContainer" containerID="d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.452000 4880 scope.go:117] "RemoveContainer" containerID="da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.469800 4880 scope.go:117] "RemoveContainer" containerID="ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.485419 4880 scope.go:117] "RemoveContainer" containerID="6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9" Feb 18 11:55:32 crc 
kubenswrapper[4880]: I0218 11:55:32.506826 4880 scope.go:117] "RemoveContainer" containerID="58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.532889 4880 scope.go:117] "RemoveContainer" containerID="187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.535455 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\": container with ID starting with 187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd not found: ID does not exist" containerID="187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.535501 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd"} err="failed to get container status \"187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\": rpc error: code = NotFound desc = could not find container \"187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd\": container with ID starting with 187077f2ac8b33146ca4199d9f153d65042f6edf9a88b13aa6fa10d8a0aa82bd not found: ID does not exist" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.535571 4880 scope.go:117] "RemoveContainer" containerID="d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.536015 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\": container with ID starting with d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4 not found: ID does not exist" 
containerID="d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.536078 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4"} err="failed to get container status \"d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\": rpc error: code = NotFound desc = could not find container \"d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4\": container with ID starting with d1757cd026f162090619c256c2cefc2b5673befe6f02305cb4d0eb7aafda48c4 not found: ID does not exist" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.536099 4880 scope.go:117] "RemoveContainer" containerID="da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.536678 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\": container with ID starting with da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef not found: ID does not exist" containerID="da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.536741 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef"} err="failed to get container status \"da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\": rpc error: code = NotFound desc = could not find container \"da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef\": container with ID starting with da62f7643321905da5e870262a17149d1fc2a28e4ceb5ebf24f18a8e1613dbef not found: ID does not exist" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.536767 4880 scope.go:117] 
"RemoveContainer" containerID="ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.537308 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\": container with ID starting with ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460 not found: ID does not exist" containerID="ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.537357 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460"} err="failed to get container status \"ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\": rpc error: code = NotFound desc = could not find container \"ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460\": container with ID starting with ecc7a0275caf3252d57185583f0b2bb1f890b190b780b7fa926f52a856571460 not found: ID does not exist" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.537396 4880 scope.go:117] "RemoveContainer" containerID="6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.538914 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\": container with ID starting with 6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9 not found: ID does not exist" containerID="6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.538940 4880 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9"} err="failed to get container status \"6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\": rpc error: code = NotFound desc = could not find container \"6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9\": container with ID starting with 6b99e08d7df5b147537248158c8c62b2cbcbd05102ccca27087dd958bfd6bba9 not found: ID does not exist" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.538994 4880 scope.go:117] "RemoveContainer" containerID="58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.539339 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\": container with ID starting with 58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd not found: ID does not exist" containerID="58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.539360 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd"} err="failed to get container status \"58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\": rpc error: code = NotFound desc = could not find container \"58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd\": container with ID starting with 58ee3639e3cb6b49bf80a608a71471e0c806582f37230a91f5a7386617da5efd not found: ID does not exist" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.638578 4880 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 
18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.638881 4880 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.639107 4880 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.639312 4880 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.639525 4880 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:32 crc kubenswrapper[4880]: I0218 11:55:32.639550 4880 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.639828 4880 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="200ms" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.841277 4880 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: 
connection refused" interval="400ms" Feb 18 11:55:32 crc kubenswrapper[4880]: E0218 11:55:32.948056 4880 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895553875ce3947 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 11:55:29.423137095 +0000 UTC m=+236.852037956,LastTimestamp:2026-02-18 11:55:29.423137095 +0000 UTC m=+236.852037956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 18 11:55:33 crc kubenswrapper[4880]: I0218 11:55:33.184448 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:33 crc kubenswrapper[4880]: I0218 11:55:33.184899 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial 
tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:33 crc kubenswrapper[4880]: I0218 11:55:33.185673 4880 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:33 crc kubenswrapper[4880]: I0218 11:55:33.186880 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 18 11:55:33 crc kubenswrapper[4880]: E0218 11:55:33.242563 4880 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="800ms" Feb 18 11:55:34 crc kubenswrapper[4880]: E0218 11:55:34.043902 4880 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="1.6s" Feb 18 11:55:35 crc kubenswrapper[4880]: E0218 11:55:35.648673 4880 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="3.2s" Feb 18 11:55:38 crc kubenswrapper[4880]: E0218 11:55:38.852165 4880 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" 
interval="6.4s" Feb 18 11:55:42 crc kubenswrapper[4880]: E0218 11:55:42.949187 4880 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895553875ce3947 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 11:55:29.423137095 +0000 UTC m=+236.852037956,LastTimestamp:2026-02-18 11:55:29.423137095 +0000 UTC m=+236.852037956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 18 11:55:43 crc kubenswrapper[4880]: I0218 11:55:43.182701 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:43 crc kubenswrapper[4880]: I0218 11:55:43.183083 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: 
connect: connection refused" Feb 18 11:55:43 crc kubenswrapper[4880]: I0218 11:55:43.941745 4880 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 18 11:55:43 crc kubenswrapper[4880]: I0218 11:55:43.941845 4880 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.179283 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.180554 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.181421 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.193894 4880 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="7baa9e97-9a09-417f-ba64-82e5e5f4276d" Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.193951 4880 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7baa9e97-9a09-417f-ba64-82e5e5f4276d" Feb 18 11:55:44 crc kubenswrapper[4880]: E0218 11:55:44.194582 4880 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.195441 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 11:55:44 crc kubenswrapper[4880]: W0218 11:55:44.224785 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-887ffdc828e60ad1bb44561f327dce38d48ea4aedce8c7247bf20c1d5ea71a77 WatchSource:0}: Error finding container 887ffdc828e60ad1bb44561f327dce38d48ea4aedce8c7247bf20c1d5ea71a77: Status 404 returned error can't find the container with id 887ffdc828e60ad1bb44561f327dce38d48ea4aedce8c7247bf20c1d5ea71a77 Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.494060 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.494578 4880 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911" exitCode=1 Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.494705 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911"} Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.495321 4880 scope.go:117] "RemoveContainer" containerID="fe1951108f1c384f95c39b38354e3c9195e8dd3e687be3e102d6c7ccdd383911" Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.495600 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.496288 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"887ffdc828e60ad1bb44561f327dce38d48ea4aedce8c7247bf20c1d5ea71a77"} Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.496546 4880 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:44 crc kubenswrapper[4880]: I0218 11:55:44.496950 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 18 11:55:45 crc kubenswrapper[4880]: E0218 11:55:45.253548 4880 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="7s" Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.441324 4880 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.506540 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.506660 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"48ab61ad052125fca879d243866d7acbafe4f86875675a710d6c7259966b3318"} Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.510684 4880 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="11cfd77d84f93485f6cea3fef38d64f56d0161ce6304cf17836387986bd7bd0a" exitCode=0 Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.510950 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"11cfd77d84f93485f6cea3fef38d64f56d0161ce6304cf17836387986bd7bd0a"} Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.513097 4880 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7baa9e97-9a09-417f-ba64-82e5e5f4276d" Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.513185 4880 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="7baa9e97-9a09-417f-ba64-82e5e5f4276d"
Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.514382 4880 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.515055 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.515366 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 18 11:55:45 crc kubenswrapper[4880]: E0218 11:55:45.515458 4880 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.515819 4880 status_manager.go:851] "Failed to get status for pod" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.516080 4880 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 18 11:55:45 crc kubenswrapper[4880]: I0218 11:55:45.516352 4880 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 18 11:55:46 crc kubenswrapper[4880]: I0218 11:55:46.519448 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"abba9ba7a067b142b18ced8cd0b4c2c638c0d2de00a54083d7989c11acacc020"}
Feb 18 11:55:46 crc kubenswrapper[4880]: I0218 11:55:46.520044 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ff099552e71fdb2ddba0c25e0320364aec7c799a006d64a709f758b6ca716b37"}
Feb 18 11:55:47 crc kubenswrapper[4880]: I0218 11:55:47.530233 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"80b849c4faca74adfa0639a3e49502b33d62630cceceaee701b51e6a9dcb9c31"}
Feb 18 11:55:47 crc kubenswrapper[4880]: I0218 11:55:47.530710 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"20f41830808679af5bf919ef5e425a91a02991684084694bdec434c001cdd918"}
Feb 18 11:55:47 crc kubenswrapper[4880]: I0218 11:55:47.530727 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"93815d7a1499672c4e7db41882ef65dc9bb4adb9c9e1e09438eff1fbd70d07ed"}
Feb 18 11:55:47 crc kubenswrapper[4880]: I0218 11:55:47.530802 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 11:55:47 crc kubenswrapper[4880]: I0218 11:55:47.530875 4880 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7baa9e97-9a09-417f-ba64-82e5e5f4276d"
Feb 18 11:55:47 crc kubenswrapper[4880]: I0218 11:55:47.530914 4880 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7baa9e97-9a09-417f-ba64-82e5e5f4276d"
Feb 18 11:55:49 crc kubenswrapper[4880]: I0218 11:55:49.196416 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 11:55:49 crc kubenswrapper[4880]: I0218 11:55:49.196477 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 11:55:49 crc kubenswrapper[4880]: I0218 11:55:49.202982 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 11:55:52 crc kubenswrapper[4880]: I0218 11:55:52.187172 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 18 11:55:52 crc kubenswrapper[4880]: I0218 11:55:52.191429 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 18 11:55:52 crc kubenswrapper[4880]: I0218 11:55:52.563887 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 18 11:55:52 crc kubenswrapper[4880]: I0218 11:55:52.699663 4880 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 11:55:53 crc kubenswrapper[4880]: I0218 11:55:53.194400 4880 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bef432ac-9411-494f-9b78-5799b54408cc"
Feb 18 11:55:53 crc kubenswrapper[4880]: I0218 11:55:53.570675 4880 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7baa9e97-9a09-417f-ba64-82e5e5f4276d"
Feb 18 11:55:53 crc kubenswrapper[4880]: I0218 11:55:53.571016 4880 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7baa9e97-9a09-417f-ba64-82e5e5f4276d"
Feb 18 11:55:53 crc kubenswrapper[4880]: I0218 11:55:53.574303 4880 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bef432ac-9411-494f-9b78-5799b54408cc"
Feb 18 11:56:02 crc kubenswrapper[4880]: I0218 11:56:02.894779 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 18 11:56:03 crc kubenswrapper[4880]: I0218 11:56:03.026909 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 18 11:56:03 crc kubenswrapper[4880]: I0218 11:56:03.309884 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 18 11:56:03 crc kubenswrapper[4880]: I0218 11:56:03.441694 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 18 11:56:03 crc kubenswrapper[4880]: I0218 11:56:03.564371 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 18 11:56:03 crc kubenswrapper[4880]: I0218 11:56:03.685380 4880 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 18 11:56:03 crc kubenswrapper[4880]: I0218 11:56:03.790069 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 18 11:56:03 crc kubenswrapper[4880]: I0218 11:56:03.883099 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 18 11:56:03 crc kubenswrapper[4880]: I0218 11:56:03.909782 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 18 11:56:03 crc kubenswrapper[4880]: I0218 11:56:03.945943 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 18 11:56:04 crc kubenswrapper[4880]: I0218 11:56:04.109049 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 18 11:56:04 crc kubenswrapper[4880]: I0218 11:56:04.419852 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 18 11:56:04 crc kubenswrapper[4880]: I0218 11:56:04.724377 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 18 11:56:04 crc kubenswrapper[4880]: I0218 11:56:04.756177 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.022981 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.044446 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.156626 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.283915 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.333227 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.345822 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.894290 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.898162 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.904524 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.946548 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.949673 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.950693 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.983573 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 18 11:56:05 crc kubenswrapper[4880]: I0218 11:56:05.995697 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.041814 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.051754 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.074512 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.132755 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.133636 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.145965 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.198021 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.442952 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.464102 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.474170 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.518237 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.521721 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.611088 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.689785 4880 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.690430 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=37.690406991 podStartE2EDuration="37.690406991s" podCreationTimestamp="2026-02-18 11:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:55:52.768127346 +0000 UTC m=+260.197028207" watchObservedRunningTime="2026-02-18 11:56:06.690406991 +0000 UTC m=+274.119307852"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.694690 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.694751 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.704226 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.704985 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.707130 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.712686 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.730829 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.732396 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.732371653 podStartE2EDuration="14.732371653s" podCreationTimestamp="2026-02-18 11:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:06.731385994 +0000 UTC m=+274.160286865" watchObservedRunningTime="2026-02-18 11:56:06.732371653 +0000 UTC m=+274.161272514"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.737171 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.769760 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.793457 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.973746 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 18 11:56:06 crc kubenswrapper[4880]: I0218 11:56:06.995234 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.076051 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.304586 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.449228 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.463416 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.501956 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.526594 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.545590 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.579770 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.602759 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.656307 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.658194 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.671450 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.691077 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.760965 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.777795 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.825754 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.833297 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.894603 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.955425 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.959494 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 18 11:56:07 crc kubenswrapper[4880]: I0218 11:56:07.992897 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.000349 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.036258 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.037423 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.126331 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.130721 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.140300 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.243153 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.290914 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.371867 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.398881 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.470471 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.498273 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.586573 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.586772 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.606387 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.610390 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.769284 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.877552 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.879669 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.909825 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.932699 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.976283 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 18 11:56:08 crc kubenswrapper[4880]: I0218 11:56:08.998575 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.016159 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.246571 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.254556 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.257439 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.290359 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.316483 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.495416 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.517859 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.546993 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.675504 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.718651 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.875531 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.877847 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.946687 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.988251 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 18 11:56:09 crc kubenswrapper[4880]: I0218 11:56:09.993640 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.003284 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.106774 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.125672 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.150764 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.200428 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.233012 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.320561 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.425423 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.503124 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.581293 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.589797 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.664199 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.875831 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 18 11:56:10 crc kubenswrapper[4880]: I0218 11:56:10.931590 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.037690 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.049366 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.096797 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.205579 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.320906 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.332389 4880 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.337737 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.384640 4880 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.388161 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.522706 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.631593 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.781991 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.797300 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.833077 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.877950 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 18 11:56:11 crc kubenswrapper[4880]: I0218 11:56:11.977963 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.037954 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.114766 4880 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.117433 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.129544 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.143490 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.156276 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.216110 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.229471 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.331565 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.353113 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.459294 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.470250 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.470259 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.500896 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.524273 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.615305 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.634182 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.754743 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.759372 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.882600 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 18 11:56:12 crc kubenswrapper[4880]: I0218 11:56:12.885583 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.012113 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.054709 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb
18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.187864 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.386522 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.409982 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.414219 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.476172 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.529472 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.658309 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.679201 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.698547 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.707924 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.746469 4880 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.847749 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 18 11:56:13 crc kubenswrapper[4880]: I0218 11:56:13.905148 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.061172 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.071221 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.076825 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.095805 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.143364 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.210528 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.240323 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.611975 4880 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.646658 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.849850 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.895725 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.964669 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 18 11:56:14 crc kubenswrapper[4880]: I0218 11:56:14.986234 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.000486 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.016474 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.092639 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.127269 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.323361 4880 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.441418 4880 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.442202 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b657f6f3a79bfe1e09ccb4cf97d1635aa2febec0c8c4049551c4f1537e05c526" gracePeriod=5 Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.445737 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.542792 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.811369 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.828515 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 11:56:15 crc kubenswrapper[4880]: I0218 11:56:15.932169 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.028496 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.052535 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.119149 4880 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.123802 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.127626 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.150427 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.173446 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.174147 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.247462 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.279040 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.358140 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.376936 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.447808 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 18 11:56:16 crc 
kubenswrapper[4880]: I0218 11:56:16.451745 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.566726 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.754835 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.928131 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.936131 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.961218 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 18 11:56:16 crc kubenswrapper[4880]: I0218 11:56:16.981017 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 18 11:56:17 crc kubenswrapper[4880]: I0218 11:56:17.113147 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 11:56:17 crc kubenswrapper[4880]: I0218 11:56:17.193198 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 18 11:56:17 crc kubenswrapper[4880]: I0218 11:56:17.211554 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 18 11:56:17 crc kubenswrapper[4880]: I0218 11:56:17.424870 4880 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 18 11:56:17 crc kubenswrapper[4880]: I0218 11:56:17.681872 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 18 11:56:17 crc kubenswrapper[4880]: I0218 11:56:17.808780 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 18 11:56:17 crc kubenswrapper[4880]: I0218 11:56:17.878726 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 18 11:56:18 crc kubenswrapper[4880]: I0218 11:56:18.052692 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 18 11:56:18 crc kubenswrapper[4880]: I0218 11:56:18.073882 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 18 11:56:18 crc kubenswrapper[4880]: I0218 11:56:18.093138 4880 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 18 11:56:18 crc kubenswrapper[4880]: I0218 11:56:18.172883 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 18 11:56:18 crc kubenswrapper[4880]: I0218 11:56:18.263655 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 18 11:56:18 crc kubenswrapper[4880]: I0218 11:56:18.376305 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 18 11:56:18 crc kubenswrapper[4880]: I0218 11:56:18.473295 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 18 11:56:18 crc kubenswrapper[4880]: I0218 
11:56:18.910587 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 18 11:56:20 crc kubenswrapper[4880]: I0218 11:56:20.730181 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 18 11:56:20 crc kubenswrapper[4880]: I0218 11:56:20.730563 4880 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b657f6f3a79bfe1e09ccb4cf97d1635aa2febec0c8c4049551c4f1537e05c526" exitCode=137 Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.044044 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.044137 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.150964 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151060 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151117 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151132 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151122 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151167 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151195 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151221 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151317 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151453 4880 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151466 4880 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151475 4880 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.151482 4880 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.161975 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.186475 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.186745 4880 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.197642 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.197682 4880 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="b10a4a56-6a73-4034-b795-ff8a7137991e" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.201324 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.201353 4880 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="b10a4a56-6a73-4034-b795-ff8a7137991e" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.253370 4880 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.739980 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.740075 4880 scope.go:117] "RemoveContainer" 
containerID="b657f6f3a79bfe1e09ccb4cf97d1635aa2febec0c8c4049551c4f1537e05c526" Feb 18 11:56:21 crc kubenswrapper[4880]: I0218 11:56:21.740192 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.828587 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bsnd4"] Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.829672 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bsnd4" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerName="registry-server" containerID="cri-o://cbbf1a727305bab64e5c88bcbd1120444b4a5013cc7df5fff5e6257f64763856" gracePeriod=30 Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.836429 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zrsr"] Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.836816 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5zrsr" podUID="e333b514-7367-498c-9660-500e02cfb188" containerName="registry-server" containerID="cri-o://10cea74274f2ef4c50e3156484da5a945dd7813e9e9d19605c5920c5ff2b887b" gracePeriod=30 Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.849080 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xzmw2"] Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.849468 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" podUID="c34a3987-0ad6-4165-a6cf-717d47fea5fb" containerName="marketplace-operator" containerID="cri-o://c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7" gracePeriod=30 Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 
11:56:26.859677 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rr7dq"] Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.860177 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rr7dq" podUID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerName="registry-server" containerID="cri-o://0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6" gracePeriod=30 Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.870029 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cbfgr"] Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.870385 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cbfgr" podUID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerName="registry-server" containerID="cri-o://59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703" gracePeriod=30 Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.888969 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-692kc"] Feb 18 11:56:26 crc kubenswrapper[4880]: E0218 11:56:26.889303 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" containerName="installer" Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.889324 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" containerName="installer" Feb 18 11:56:26 crc kubenswrapper[4880]: E0218 11:56:26.889340 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.889349 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 18 
11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.889465 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.889483 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b2631ac-d27b-47f0-928a-aff9b9f00d30" containerName="installer"
Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.890028 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.910849 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-692kc"]
Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.930687 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh978\" (UniqueName: \"kubernetes.io/projected/bc432e98-ce9f-455b-b3c7-b254dfb4a649-kube-api-access-mh978\") pod \"marketplace-operator-79b997595-692kc\" (UID: \"bc432e98-ce9f-455b-b3c7-b254dfb4a649\") " pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.930753 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc432e98-ce9f-455b-b3c7-b254dfb4a649-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-692kc\" (UID: \"bc432e98-ce9f-455b-b3c7-b254dfb4a649\") " pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:26 crc kubenswrapper[4880]: I0218 11:56:26.931072 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc432e98-ce9f-455b-b3c7-b254dfb4a649-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-692kc\" (UID: \"bc432e98-ce9f-455b-b3c7-b254dfb4a649\") " pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.032812 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc432e98-ce9f-455b-b3c7-b254dfb4a649-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-692kc\" (UID: \"bc432e98-ce9f-455b-b3c7-b254dfb4a649\") " pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.032902 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh978\" (UniqueName: \"kubernetes.io/projected/bc432e98-ce9f-455b-b3c7-b254dfb4a649-kube-api-access-mh978\") pod \"marketplace-operator-79b997595-692kc\" (UID: \"bc432e98-ce9f-455b-b3c7-b254dfb4a649\") " pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.032935 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc432e98-ce9f-455b-b3c7-b254dfb4a649-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-692kc\" (UID: \"bc432e98-ce9f-455b-b3c7-b254dfb4a649\") " pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.034195 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc432e98-ce9f-455b-b3c7-b254dfb4a649-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-692kc\" (UID: \"bc432e98-ce9f-455b-b3c7-b254dfb4a649\") " pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.042452 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bc432e98-ce9f-455b-b3c7-b254dfb4a649-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-692kc\" (UID: \"bc432e98-ce9f-455b-b3c7-b254dfb4a649\") " pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.050670 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh978\" (UniqueName: \"kubernetes.io/projected/bc432e98-ce9f-455b-b3c7-b254dfb4a649-kube-api-access-mh978\") pod \"marketplace-operator-79b997595-692kc\" (UID: \"bc432e98-ce9f-455b-b3c7-b254dfb4a649\") " pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.216095 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.347385 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rr7dq"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.351913 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.361691 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cbfgr"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.418080 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-692kc"]
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.443904 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-catalog-content\") pod \"95755e2d-9c90-48ce-8141-7c209fd6d193\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") "
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.444010 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psf94\" (UniqueName: \"kubernetes.io/projected/c34a3987-0ad6-4165-a6cf-717d47fea5fb-kube-api-access-psf94\") pod \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") "
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.444100 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-trusted-ca\") pod \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") "
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.444150 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-utilities\") pod \"95755e2d-9c90-48ce-8141-7c209fd6d193\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") "
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.444225 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pjcv\" (UniqueName: \"kubernetes.io/projected/95755e2d-9c90-48ce-8141-7c209fd6d193-kube-api-access-9pjcv\") pod \"95755e2d-9c90-48ce-8141-7c209fd6d193\" (UID: \"95755e2d-9c90-48ce-8141-7c209fd6d193\") "
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.444259 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-operator-metrics\") pod \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\" (UID: \"c34a3987-0ad6-4165-a6cf-717d47fea5fb\") "
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.444385 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-catalog-content\") pod \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") "
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.444410 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-utilities\") pod \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") "
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.444433 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qh96\" (UniqueName: \"kubernetes.io/projected/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-kube-api-access-5qh96\") pod \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\" (UID: \"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7\") "
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.446977 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c34a3987-0ad6-4165-a6cf-717d47fea5fb" (UID: "c34a3987-0ad6-4165-a6cf-717d47fea5fb"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.448177 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-utilities" (OuterVolumeSpecName: "utilities") pod "19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" (UID: "19fc1b7a-8fb8-4231-968c-4f6b3ff973e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.448588 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-utilities" (OuterVolumeSpecName: "utilities") pod "95755e2d-9c90-48ce-8141-7c209fd6d193" (UID: "95755e2d-9c90-48ce-8141-7c209fd6d193"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.449373 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-kube-api-access-5qh96" (OuterVolumeSpecName: "kube-api-access-5qh96") pod "19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" (UID: "19fc1b7a-8fb8-4231-968c-4f6b3ff973e7"). InnerVolumeSpecName "kube-api-access-5qh96". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.451206 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c34a3987-0ad6-4165-a6cf-717d47fea5fb-kube-api-access-psf94" (OuterVolumeSpecName: "kube-api-access-psf94") pod "c34a3987-0ad6-4165-a6cf-717d47fea5fb" (UID: "c34a3987-0ad6-4165-a6cf-717d47fea5fb"). InnerVolumeSpecName "kube-api-access-psf94". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.451651 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c34a3987-0ad6-4165-a6cf-717d47fea5fb" (UID: "c34a3987-0ad6-4165-a6cf-717d47fea5fb"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.452438 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95755e2d-9c90-48ce-8141-7c209fd6d193-kube-api-access-9pjcv" (OuterVolumeSpecName: "kube-api-access-9pjcv") pod "95755e2d-9c90-48ce-8141-7c209fd6d193" (UID: "95755e2d-9c90-48ce-8141-7c209fd6d193"). InnerVolumeSpecName "kube-api-access-9pjcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.475074 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" (UID: "19fc1b7a-8fb8-4231-968c-4f6b3ff973e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.545787 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psf94\" (UniqueName: \"kubernetes.io/projected/c34a3987-0ad6-4165-a6cf-717d47fea5fb-kube-api-access-psf94\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.545827 4880 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.545836 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.545849 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pjcv\" (UniqueName: \"kubernetes.io/projected/95755e2d-9c90-48ce-8141-7c209fd6d193-kube-api-access-9pjcv\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.545861 4880 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c34a3987-0ad6-4165-a6cf-717d47fea5fb-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.545873 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.545886 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.545895 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qh96\" (UniqueName: \"kubernetes.io/projected/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7-kube-api-access-5qh96\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.585540 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95755e2d-9c90-48ce-8141-7c209fd6d193" (UID: "95755e2d-9c90-48ce-8141-7c209fd6d193"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.647053 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95755e2d-9c90-48ce-8141-7c209fd6d193-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.790442 4880 generic.go:334] "Generic (PLEG): container finished" podID="e333b514-7367-498c-9660-500e02cfb188" containerID="10cea74274f2ef4c50e3156484da5a945dd7813e9e9d19605c5920c5ff2b887b" exitCode=0
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.790497 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrsr" event={"ID":"e333b514-7367-498c-9660-500e02cfb188","Type":"ContainerDied","Data":"10cea74274f2ef4c50e3156484da5a945dd7813e9e9d19605c5920c5ff2b887b"}
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.792428 4880 generic.go:334] "Generic (PLEG): container finished" podID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerID="cbbf1a727305bab64e5c88bcbd1120444b4a5013cc7df5fff5e6257f64763856" exitCode=0
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.792498 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnd4" event={"ID":"40e949a6-d734-42e6-9423-8597f6d4c9de","Type":"ContainerDied","Data":"cbbf1a727305bab64e5c88bcbd1120444b4a5013cc7df5fff5e6257f64763856"}
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.793939 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-692kc" event={"ID":"bc432e98-ce9f-455b-b3c7-b254dfb4a649","Type":"ContainerStarted","Data":"f29c7f58c5d59ae959df8a58855583752415e31b2545f505f82d256fc2dcab50"}
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.795800 4880 generic.go:334] "Generic (PLEG): container finished" podID="c34a3987-0ad6-4165-a6cf-717d47fea5fb" containerID="c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7" exitCode=0
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.795860 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.795867 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" event={"ID":"c34a3987-0ad6-4165-a6cf-717d47fea5fb","Type":"ContainerDied","Data":"c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7"}
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.795898 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xzmw2" event={"ID":"c34a3987-0ad6-4165-a6cf-717d47fea5fb","Type":"ContainerDied","Data":"5dcb1b7c2dde4810da9af17461c85a7e7b64be9d753c6536693ccc24349354f7"}
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.795925 4880 scope.go:117] "RemoveContainer" containerID="c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.798964 4880 generic.go:334] "Generic (PLEG): container finished" podID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerID="59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703" exitCode=0
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.799049 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cbfgr"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.798998 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbfgr" event={"ID":"95755e2d-9c90-48ce-8141-7c209fd6d193","Type":"ContainerDied","Data":"59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703"}
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.799115 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbfgr" event={"ID":"95755e2d-9c90-48ce-8141-7c209fd6d193","Type":"ContainerDied","Data":"59f243e58d09443974564189b46ae65b4242341133f9e29debb9954f8d29a538"}
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.809085 4880 generic.go:334] "Generic (PLEG): container finished" podID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerID="0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6" exitCode=0
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.809135 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rr7dq" event={"ID":"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7","Type":"ContainerDied","Data":"0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6"}
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.809154 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rr7dq"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.809171 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rr7dq" event={"ID":"19fc1b7a-8fb8-4231-968c-4f6b3ff973e7","Type":"ContainerDied","Data":"d97a34a1755e3a664ffa5eb76f2b70fe7f9e91371f0e4855fedf75c7a9abc744"}
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.821827 4880 scope.go:117] "RemoveContainer" containerID="c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7"
Feb 18 11:56:27 crc kubenswrapper[4880]: E0218 11:56:27.822322 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7\": container with ID starting with c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7 not found: ID does not exist" containerID="c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.822350 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7"} err="failed to get container status \"c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7\": rpc error: code = NotFound desc = could not find container \"c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7\": container with ID starting with c5b642f05577ea1e8bbdfacb2ddb18b1beceb96ee347af8fc15d4ec93096adc7 not found: ID does not exist"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.822373 4880 scope.go:117] "RemoveContainer" containerID="59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.834904 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cbfgr"]
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.848635 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cbfgr"]
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.867766 4880 scope.go:117] "RemoveContainer" containerID="24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.876195 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xzmw2"]
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.888095 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xzmw2"]
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.895706 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rr7dq"]
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.898054 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rr7dq"]
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.922417 4880 scope.go:117] "RemoveContainer" containerID="62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.946014 4880 scope.go:117] "RemoveContainer" containerID="59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703"
Feb 18 11:56:27 crc kubenswrapper[4880]: E0218 11:56:27.947159 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703\": container with ID starting with 59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703 not found: ID does not exist" containerID="59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.947191 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703"} err="failed to get container status \"59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703\": rpc error: code = NotFound desc = could not find container \"59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703\": container with ID starting with 59b5ce5b7b2b02614d3e73046f6688e859e27d9ffaaa944110b5808cd5164703 not found: ID does not exist"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.947219 4880 scope.go:117] "RemoveContainer" containerID="24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f"
Feb 18 11:56:27 crc kubenswrapper[4880]: E0218 11:56:27.947737 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f\": container with ID starting with 24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f not found: ID does not exist" containerID="24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.947753 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f"} err="failed to get container status \"24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f\": rpc error: code = NotFound desc = could not find container \"24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f\": container with ID starting with 24adc1b2035d779e7942ea43ee0c091fd89b42f92a4bb3d3f3c6c98874ad214f not found: ID does not exist"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.947765 4880 scope.go:117] "RemoveContainer" containerID="62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b"
Feb 18 11:56:27 crc kubenswrapper[4880]: E0218 11:56:27.948028 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b\": container with ID starting with 62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b not found: ID does not exist" containerID="62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.948072 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b"} err="failed to get container status \"62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b\": rpc error: code = NotFound desc = could not find container \"62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b\": container with ID starting with 62070175e6f7257d72d22f726a2a193fb685cd98acb3222b24ba7d7348bd497b not found: ID does not exist"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.948085 4880 scope.go:117] "RemoveContainer" containerID="0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6"
Feb 18 11:56:27 crc kubenswrapper[4880]: I0218 11:56:27.990236 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.018392 4880 scope.go:117] "RemoveContainer" containerID="acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.021007 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.067457 4880 scope.go:117] "RemoveContainer" containerID="ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.067805 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-utilities\") pod \"40e949a6-d734-42e6-9423-8597f6d4c9de\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") "
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.067896 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fssnx\" (UniqueName: \"kubernetes.io/projected/40e949a6-d734-42e6-9423-8597f6d4c9de-kube-api-access-fssnx\") pod \"40e949a6-d734-42e6-9423-8597f6d4c9de\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") "
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.067991 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-catalog-content\") pod \"40e949a6-d734-42e6-9423-8597f6d4c9de\" (UID: \"40e949a6-d734-42e6-9423-8597f6d4c9de\") "
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.069020 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-utilities" (OuterVolumeSpecName: "utilities") pod "40e949a6-d734-42e6-9423-8597f6d4c9de" (UID: "40e949a6-d734-42e6-9423-8597f6d4c9de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.074309 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e949a6-d734-42e6-9423-8597f6d4c9de-kube-api-access-fssnx" (OuterVolumeSpecName: "kube-api-access-fssnx") pod "40e949a6-d734-42e6-9423-8597f6d4c9de" (UID: "40e949a6-d734-42e6-9423-8597f6d4c9de"). InnerVolumeSpecName "kube-api-access-fssnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.079240 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zrsr"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.086073 4880 scope.go:117] "RemoveContainer" containerID="0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6"
Feb 18 11:56:28 crc kubenswrapper[4880]: E0218 11:56:28.086594 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6\": container with ID starting with 0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6 not found: ID does not exist" containerID="0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.086671 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6"} err="failed to get container status \"0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6\": rpc error: code = NotFound desc = could not find container \"0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6\": container with ID starting with 0181673d49d6f437ba088d7fffcb9014414835a23e80ef72446e10c78d5c2bc6 not found: ID does not exist"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.086736 4880 scope.go:117] "RemoveContainer" containerID="acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314"
Feb 18 11:56:28 crc kubenswrapper[4880]: E0218 11:56:28.087240 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314\": container with ID starting with acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314 not found: ID does not exist" containerID="acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.087300 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314"} err="failed to get container status \"acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314\": rpc error: code = NotFound desc = could not find container \"acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314\": container with ID starting with acc0e76baaebe877b340d1d7a28c18ecd099fe1d3e15e58dccfd7f211144d314 not found: ID does not exist"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.087339 4880 scope.go:117] "RemoveContainer" containerID="ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e"
Feb 18 11:56:28 crc kubenswrapper[4880]: E0218 11:56:28.087846 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e\": container with ID starting with ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e not found: ID does not exist" containerID="ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.087883 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e"} err="failed to get container status \"ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e\": rpc error: code = NotFound desc = could not find container \"ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e\": container with ID starting with ffb5d770ce05709dff0a4054c9b35896562b25ce188d445d3a938b4be870594e not found: ID does not exist"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.121799 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40e949a6-d734-42e6-9423-8597f6d4c9de" (UID: "40e949a6-d734-42e6-9423-8597f6d4c9de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.168774 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8wnc\" (UniqueName: \"kubernetes.io/projected/e333b514-7367-498c-9660-500e02cfb188-kube-api-access-c8wnc\") pod \"e333b514-7367-498c-9660-500e02cfb188\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") "
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.168842 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-catalog-content\") pod \"e333b514-7367-498c-9660-500e02cfb188\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") "
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.168958 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-utilities\") pod \"e333b514-7367-498c-9660-500e02cfb188\" (UID: \"e333b514-7367-498c-9660-500e02cfb188\") "
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.169291 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.169318 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fssnx\" (UniqueName: \"kubernetes.io/projected/40e949a6-d734-42e6-9423-8597f6d4c9de-kube-api-access-fssnx\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.169333 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e949a6-d734-42e6-9423-8597f6d4c9de-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.169902 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-utilities" (OuterVolumeSpecName: "utilities") pod "e333b514-7367-498c-9660-500e02cfb188" (UID: "e333b514-7367-498c-9660-500e02cfb188"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.171859 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e333b514-7367-498c-9660-500e02cfb188-kube-api-access-c8wnc" (OuterVolumeSpecName: "kube-api-access-c8wnc") pod "e333b514-7367-498c-9660-500e02cfb188" (UID: "e333b514-7367-498c-9660-500e02cfb188"). InnerVolumeSpecName "kube-api-access-c8wnc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.220555 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e333b514-7367-498c-9660-500e02cfb188" (UID: "e333b514-7367-498c-9660-500e02cfb188"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.270721 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.270762 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8wnc\" (UniqueName: \"kubernetes.io/projected/e333b514-7367-498c-9660-500e02cfb188-kube-api-access-c8wnc\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.270776 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e333b514-7367-498c-9660-500e02cfb188-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.816015 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsnd4" event={"ID":"40e949a6-d734-42e6-9423-8597f6d4c9de","Type":"ContainerDied","Data":"89ae412976a8ddaea91d832224ed7cfd80f920d0e3e9116982e656ab66f9dd01"}
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.816124 4880 scope.go:117] "RemoveContainer" containerID="cbbf1a727305bab64e5c88bcbd1120444b4a5013cc7df5fff5e6257f64763856"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.816042 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsnd4"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.818466 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-692kc" event={"ID":"bc432e98-ce9f-455b-b3c7-b254dfb4a649","Type":"ContainerStarted","Data":"bab54daa041e9beba4078798adc322b8c4288fa8cddadbc288bdf2029424ca7f"}
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.818645 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.826542 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrsr" event={"ID":"e333b514-7367-498c-9660-500e02cfb188","Type":"ContainerDied","Data":"df90c46bab9866bbdfbc6323b0766f989109dbdd6dd9833cb4031fd3b6f237ef"}
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.826700 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-692kc"
Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.826799 4880 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-5zrsr" Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.831290 4880 scope.go:117] "RemoveContainer" containerID="e50e2e35c15ec108485d909892bd52a7f0b252a0134319da39beadfc23e2e38e" Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.837375 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-692kc" podStartSLOduration=2.837351904 podStartE2EDuration="2.837351904s" podCreationTimestamp="2026-02-18 11:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:56:28.837095627 +0000 UTC m=+296.265996508" watchObservedRunningTime="2026-02-18 11:56:28.837351904 +0000 UTC m=+296.266252765" Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.857345 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bsnd4"] Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.858882 4880 scope.go:117] "RemoveContainer" containerID="5c901d2364328bd2f2d119d06037af4cab4f981a6053601304d341ee3d05b90a" Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.862451 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bsnd4"] Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.876190 4880 scope.go:117] "RemoveContainer" containerID="10cea74274f2ef4c50e3156484da5a945dd7813e9e9d19605c5920c5ff2b887b" Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.894155 4880 scope.go:117] "RemoveContainer" containerID="daec6b250c4131128b4d0394905996015979024a9a0b57c557465abb935a7667" Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.894694 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zrsr"] Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.904051 4880 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/community-operators-5zrsr"] Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.913422 4880 scope.go:117] "RemoveContainer" containerID="1929907c176b3fbd8247311ed5caadebfadcc54bfe10c0d2000e89cf2dbe61e1" Feb 18 11:56:28 crc kubenswrapper[4880]: I0218 11:56:28.930984 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 18 11:56:29 crc kubenswrapper[4880]: I0218 11:56:29.188380 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" path="/var/lib/kubelet/pods/19fc1b7a-8fb8-4231-968c-4f6b3ff973e7/volumes" Feb 18 11:56:29 crc kubenswrapper[4880]: I0218 11:56:29.189051 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" path="/var/lib/kubelet/pods/40e949a6-d734-42e6-9423-8597f6d4c9de/volumes" Feb 18 11:56:29 crc kubenswrapper[4880]: I0218 11:56:29.189596 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95755e2d-9c90-48ce-8141-7c209fd6d193" path="/var/lib/kubelet/pods/95755e2d-9c90-48ce-8141-7c209fd6d193/volumes" Feb 18 11:56:29 crc kubenswrapper[4880]: I0218 11:56:29.191004 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34a3987-0ad6-4165-a6cf-717d47fea5fb" path="/var/lib/kubelet/pods/c34a3987-0ad6-4165-a6cf-717d47fea5fb/volumes" Feb 18 11:56:29 crc kubenswrapper[4880]: I0218 11:56:29.191712 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e333b514-7367-498c-9660-500e02cfb188" path="/var/lib/kubelet/pods/e333b514-7367-498c-9660-500e02cfb188/volumes" Feb 18 11:56:31 crc kubenswrapper[4880]: I0218 11:56:31.218854 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 18 11:56:33 crc kubenswrapper[4880]: I0218 11:56:33.009391 4880 cert_rotation.go:91] certificate rotation detected, shutting down client 
connections to start using new credentials Feb 18 11:56:35 crc kubenswrapper[4880]: I0218 11:56:35.447702 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 18 11:56:35 crc kubenswrapper[4880]: I0218 11:56:35.557856 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 18 11:56:37 crc kubenswrapper[4880]: I0218 11:56:37.343169 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 18 11:56:40 crc kubenswrapper[4880]: I0218 11:56:40.310658 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 18 11:56:42 crc kubenswrapper[4880]: I0218 11:56:42.715738 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 18 11:56:42 crc kubenswrapper[4880]: I0218 11:56:42.987441 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 18 11:56:43 crc kubenswrapper[4880]: I0218 11:56:43.953574 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 18 11:56:44 crc kubenswrapper[4880]: I0218 11:56:44.832712 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 18 11:56:47 crc kubenswrapper[4880]: I0218 11:56:47.282755 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 18 11:56:48 crc kubenswrapper[4880]: I0218 11:56:48.775278 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 18 11:56:48 crc kubenswrapper[4880]: I0218 11:56:48.798455 4880 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 18 11:56:48 crc kubenswrapper[4880]: I0218 11:56:48.866717 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 18 11:56:53 crc kubenswrapper[4880]: I0218 11:56:53.230551 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 18 11:56:53 crc kubenswrapper[4880]: I0218 11:56:53.451822 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 18 11:56:54 crc kubenswrapper[4880]: I0218 11:56:54.993246 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 18 11:56:57 crc kubenswrapper[4880]: I0218 11:56:57.424499 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 18 11:57:20 crc kubenswrapper[4880]: I0218 11:57:20.937708 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxz72"] Feb 18 11:57:20 crc kubenswrapper[4880]: I0218 11:57:20.938529 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" podUID="9f0337cb-0aed-41af-b587-a16392350413" containerName="controller-manager" containerID="cri-o://54cb11ceecfa24c056963ad9ae9c6b06764f799f7bcdb9e4b2c7ff211397af6f" gracePeriod=30 Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.032960 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49"] Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.033283 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" 
podUID="fbca9013-a5ce-4d98-8552-503f5b2d8f45" containerName="route-controller-manager" containerID="cri-o://b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb" gracePeriod=30 Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.133160 4880 generic.go:334] "Generic (PLEG): container finished" podID="9f0337cb-0aed-41af-b587-a16392350413" containerID="54cb11ceecfa24c056963ad9ae9c6b06764f799f7bcdb9e4b2c7ff211397af6f" exitCode=0 Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.133216 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" event={"ID":"9f0337cb-0aed-41af-b587-a16392350413","Type":"ContainerDied","Data":"54cb11ceecfa24c056963ad9ae9c6b06764f799f7bcdb9e4b2c7ff211397af6f"} Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.309601 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.413978 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.492332 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-config\") pod \"9f0337cb-0aed-41af-b587-a16392350413\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.492400 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-proxy-ca-bundles\") pod \"9f0337cb-0aed-41af-b587-a16392350413\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.492445 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0337cb-0aed-41af-b587-a16392350413-serving-cert\") pod \"9f0337cb-0aed-41af-b587-a16392350413\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.492486 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qscth\" (UniqueName: \"kubernetes.io/projected/9f0337cb-0aed-41af-b587-a16392350413-kube-api-access-qscth\") pod \"9f0337cb-0aed-41af-b587-a16392350413\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.492531 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-client-ca\") pod \"9f0337cb-0aed-41af-b587-a16392350413\" (UID: \"9f0337cb-0aed-41af-b587-a16392350413\") " Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.493230 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9f0337cb-0aed-41af-b587-a16392350413" (UID: "9f0337cb-0aed-41af-b587-a16392350413"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.493250 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-client-ca" (OuterVolumeSpecName: "client-ca") pod "9f0337cb-0aed-41af-b587-a16392350413" (UID: "9f0337cb-0aed-41af-b587-a16392350413"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.493277 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-config" (OuterVolumeSpecName: "config") pod "9f0337cb-0aed-41af-b587-a16392350413" (UID: "9f0337cb-0aed-41af-b587-a16392350413"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.498008 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f0337cb-0aed-41af-b587-a16392350413-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9f0337cb-0aed-41af-b587-a16392350413" (UID: "9f0337cb-0aed-41af-b587-a16392350413"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.498519 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0337cb-0aed-41af-b587-a16392350413-kube-api-access-qscth" (OuterVolumeSpecName: "kube-api-access-qscth") pod "9f0337cb-0aed-41af-b587-a16392350413" (UID: "9f0337cb-0aed-41af-b587-a16392350413"). InnerVolumeSpecName "kube-api-access-qscth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.593682 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-client-ca\") pod \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.593747 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-config\") pod \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.593832 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbca9013-a5ce-4d98-8552-503f5b2d8f45-serving-cert\") pod \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.593881 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flsmp\" (UniqueName: \"kubernetes.io/projected/fbca9013-a5ce-4d98-8552-503f5b2d8f45-kube-api-access-flsmp\") pod \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\" (UID: \"fbca9013-a5ce-4d98-8552-503f5b2d8f45\") " Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.594141 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qscth\" (UniqueName: \"kubernetes.io/projected/9f0337cb-0aed-41af-b587-a16392350413-kube-api-access-qscth\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.594157 4880 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 
11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.594167 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.594177 4880 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f0337cb-0aed-41af-b587-a16392350413-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.594207 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0337cb-0aed-41af-b587-a16392350413-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.594517 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-client-ca" (OuterVolumeSpecName: "client-ca") pod "fbca9013-a5ce-4d98-8552-503f5b2d8f45" (UID: "fbca9013-a5ce-4d98-8552-503f5b2d8f45"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.594534 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-config" (OuterVolumeSpecName: "config") pod "fbca9013-a5ce-4d98-8552-503f5b2d8f45" (UID: "fbca9013-a5ce-4d98-8552-503f5b2d8f45"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.597598 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbca9013-a5ce-4d98-8552-503f5b2d8f45-kube-api-access-flsmp" (OuterVolumeSpecName: "kube-api-access-flsmp") pod "fbca9013-a5ce-4d98-8552-503f5b2d8f45" (UID: "fbca9013-a5ce-4d98-8552-503f5b2d8f45"). 
InnerVolumeSpecName "kube-api-access-flsmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.598063 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbca9013-a5ce-4d98-8552-503f5b2d8f45-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fbca9013-a5ce-4d98-8552-503f5b2d8f45" (UID: "fbca9013-a5ce-4d98-8552-503f5b2d8f45"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.695436 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbca9013-a5ce-4d98-8552-503f5b2d8f45-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.695480 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flsmp\" (UniqueName: \"kubernetes.io/projected/fbca9013-a5ce-4d98-8552-503f5b2d8f45-kube-api-access-flsmp\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.695492 4880 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:21 crc kubenswrapper[4880]: I0218 11:57:21.695509 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbca9013-a5ce-4d98-8552-503f5b2d8f45-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.142861 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" event={"ID":"9f0337cb-0aed-41af-b587-a16392350413","Type":"ContainerDied","Data":"09f3b432b886cc050f49fb7ad116e7a59fe9fd6c3477aad4ded888ac13f47d35"} Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.142901 4880 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sxz72" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.142946 4880 scope.go:117] "RemoveContainer" containerID="54cb11ceecfa24c056963ad9ae9c6b06764f799f7bcdb9e4b2c7ff211397af6f" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.147326 4880 generic.go:334] "Generic (PLEG): container finished" podID="fbca9013-a5ce-4d98-8552-503f5b2d8f45" containerID="b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb" exitCode=0 Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.147375 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" event={"ID":"fbca9013-a5ce-4d98-8552-503f5b2d8f45","Type":"ContainerDied","Data":"b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb"} Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.147428 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" event={"ID":"fbca9013-a5ce-4d98-8552-503f5b2d8f45","Type":"ContainerDied","Data":"1a2cfcc39ecc2621153ce25906d4dbc2480e0bbfaeb4460193173e1576002885"} Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.147549 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.163973 4880 scope.go:117] "RemoveContainer" containerID="b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.182935 4880 scope.go:117] "RemoveContainer" containerID="b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.183557 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb\": container with ID starting with b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb not found: ID does not exist" containerID="b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.183650 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb"} err="failed to get container status \"b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb\": rpc error: code = NotFound desc = could not find container \"b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb\": container with ID starting with b6edfb95ed6f56d6daf348fefba92264bf1afa19531656467a840a364c410bcb not found: ID does not exist" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.185555 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49"] Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.191310 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-whp49"] Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.201096 4880 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxz72"] Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.204248 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxz72"] Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.450979 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"] Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451450 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c34a3987-0ad6-4165-a6cf-717d47fea5fb" containerName="marketplace-operator" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451476 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34a3987-0ad6-4165-a6cf-717d47fea5fb" containerName="marketplace-operator" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451507 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e333b514-7367-498c-9660-500e02cfb188" containerName="extract-content" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451520 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="e333b514-7367-498c-9660-500e02cfb188" containerName="extract-content" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451538 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e333b514-7367-498c-9660-500e02cfb188" containerName="extract-utilities" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451552 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="e333b514-7367-498c-9660-500e02cfb188" containerName="extract-utilities" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451569 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerName="extract-utilities" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451582 4880 
state_mem.go:107] "Deleted CPUSet assignment" podUID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerName="extract-utilities" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451602 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerName="extract-content" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451643 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerName="extract-content" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451664 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerName="extract-utilities" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451677 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerName="extract-utilities" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451691 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbca9013-a5ce-4d98-8552-503f5b2d8f45" containerName="route-controller-manager" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451704 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbca9013-a5ce-4d98-8552-503f5b2d8f45" containerName="route-controller-manager" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451728 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerName="extract-utilities" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451741 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerName="extract-utilities" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451761 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerName="extract-content" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 
11:57:22.451774 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerName="extract-content" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451796 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e333b514-7367-498c-9660-500e02cfb188" containerName="registry-server" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451837 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="e333b514-7367-498c-9660-500e02cfb188" containerName="registry-server" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451856 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerName="registry-server" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451869 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerName="registry-server" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451893 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerName="registry-server" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451909 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerName="registry-server" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451929 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerName="registry-server" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.451941 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerName="registry-server" Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451962 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0337cb-0aed-41af-b587-a16392350413" containerName="controller-manager" Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 
11:57:22.451975 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0337cb-0aed-41af-b587-a16392350413" containerName="controller-manager"
Feb 18 11:57:22 crc kubenswrapper[4880]: E0218 11:57:22.451992 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerName="extract-content"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.452004 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerName="extract-content"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.452158 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fc1b7a-8fb8-4231-968c-4f6b3ff973e7" containerName="registry-server"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.452186 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0337cb-0aed-41af-b587-a16392350413" containerName="controller-manager"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.452204 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="95755e2d-9c90-48ce-8141-7c209fd6d193" containerName="registry-server"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.452224 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="e333b514-7367-498c-9660-500e02cfb188" containerName="registry-server"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.452240 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="40e949a6-d734-42e6-9423-8597f6d4c9de" containerName="registry-server"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.452257 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbca9013-a5ce-4d98-8552-503f5b2d8f45" containerName="route-controller-manager"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.452272 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="c34a3987-0ad6-4165-a6cf-717d47fea5fb" containerName="marketplace-operator"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.453007 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.455656 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.456074 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.457242 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.458075 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.458168 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.458522 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.467015 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-559465bd67-5sxdl"]
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.468395 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.474651 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.474916 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.475170 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.475791 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.476026 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.476315 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.479788 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"]
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.485504 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.486143 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-559465bd67-5sxdl"]
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.505078 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-proxy-ca-bundles\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.505141 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-config\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.505159 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-client-ca\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.505186 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-client-ca\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.505207 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-config\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.505222 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llvgq\" (UniqueName: \"kubernetes.io/projected/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-kube-api-access-llvgq\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.505260 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-serving-cert\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.505280 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17966ec8-2694-480d-9cd0-fd8a7c92e342-serving-cert\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.505300 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnzb6\" (UniqueName: \"kubernetes.io/projected/17966ec8-2694-480d-9cd0-fd8a7c92e342-kube-api-access-qnzb6\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.606399 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-client-ca\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.606469 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-config\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.606489 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llvgq\" (UniqueName: \"kubernetes.io/projected/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-kube-api-access-llvgq\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.606528 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-serving-cert\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.606554 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17966ec8-2694-480d-9cd0-fd8a7c92e342-serving-cert\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.606573 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnzb6\" (UniqueName: \"kubernetes.io/projected/17966ec8-2694-480d-9cd0-fd8a7c92e342-kube-api-access-qnzb6\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.606599 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-proxy-ca-bundles\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.606640 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-client-ca\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.606660 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-config\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.607808 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-client-ca\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.608400 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-config\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.608642 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-config\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.608708 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-client-ca\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.609096 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-proxy-ca-bundles\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.610883 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-serving-cert\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.615502 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17966ec8-2694-480d-9cd0-fd8a7c92e342-serving-cert\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.629454 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llvgq\" (UniqueName: \"kubernetes.io/projected/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-kube-api-access-llvgq\") pod \"route-controller-manager-cdfcd5df-7p4sk\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") " pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.635084 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnzb6\" (UniqueName: \"kubernetes.io/projected/17966ec8-2694-480d-9cd0-fd8a7c92e342-kube-api-access-qnzb6\") pod \"controller-manager-559465bd67-5sxdl\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") " pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.784468 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:22 crc kubenswrapper[4880]: I0218 11:57:22.802750 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:23 crc kubenswrapper[4880]: I0218 11:57:23.078198 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-559465bd67-5sxdl"]
Feb 18 11:57:23 crc kubenswrapper[4880]: I0218 11:57:23.107603 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"]
Feb 18 11:57:23 crc kubenswrapper[4880]: W0218 11:57:23.113821 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12baeda4_4cb8_4bc6_af12_07c20d71fb6b.slice/crio-e0ec8dd50d01dc241ae2ef15ffaaf4422f1e400bf291a90a81e8680a0b7e320a WatchSource:0}: Error finding container e0ec8dd50d01dc241ae2ef15ffaaf4422f1e400bf291a90a81e8680a0b7e320a: Status 404 returned error can't find the container with id e0ec8dd50d01dc241ae2ef15ffaaf4422f1e400bf291a90a81e8680a0b7e320a
Feb 18 11:57:23 crc kubenswrapper[4880]: I0218 11:57:23.162271 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl" event={"ID":"17966ec8-2694-480d-9cd0-fd8a7c92e342","Type":"ContainerStarted","Data":"84b45d6f6710e11c52198b5dbc32b80d3efefa4daf745ad21d96a285dd22dc4f"}
Feb 18 11:57:23 crc kubenswrapper[4880]: I0218 11:57:23.164677 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk" event={"ID":"12baeda4-4cb8-4bc6-af12-07c20d71fb6b","Type":"ContainerStarted","Data":"e0ec8dd50d01dc241ae2ef15ffaaf4422f1e400bf291a90a81e8680a0b7e320a"}
Feb 18 11:57:23 crc kubenswrapper[4880]: I0218 11:57:23.194747 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0337cb-0aed-41af-b587-a16392350413" path="/var/lib/kubelet/pods/9f0337cb-0aed-41af-b587-a16392350413/volumes"
Feb 18 11:57:23 crc kubenswrapper[4880]: I0218 11:57:23.195428 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbca9013-a5ce-4d98-8552-503f5b2d8f45" path="/var/lib/kubelet/pods/fbca9013-a5ce-4d98-8552-503f5b2d8f45/volumes"
Feb 18 11:57:24 crc kubenswrapper[4880]: I0218 11:57:24.173169 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl" event={"ID":"17966ec8-2694-480d-9cd0-fd8a7c92e342","Type":"ContainerStarted","Data":"57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66"}
Feb 18 11:57:24 crc kubenswrapper[4880]: I0218 11:57:24.173651 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:24 crc kubenswrapper[4880]: I0218 11:57:24.175962 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk" event={"ID":"12baeda4-4cb8-4bc6-af12-07c20d71fb6b","Type":"ContainerStarted","Data":"08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f"}
Feb 18 11:57:24 crc kubenswrapper[4880]: I0218 11:57:24.176309 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:24 crc kubenswrapper[4880]: I0218 11:57:24.178074 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:24 crc kubenswrapper[4880]: I0218 11:57:24.180694 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:24 crc kubenswrapper[4880]: I0218 11:57:24.193838 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl" podStartSLOduration=4.193815207 podStartE2EDuration="4.193815207s" podCreationTimestamp="2026-02-18 11:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:24.190904226 +0000 UTC m=+351.619805097" watchObservedRunningTime="2026-02-18 11:57:24.193815207 +0000 UTC m=+351.622716068"
Feb 18 11:57:24 crc kubenswrapper[4880]: I0218 11:57:24.212019 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk" podStartSLOduration=3.211999415 podStartE2EDuration="3.211999415s" podCreationTimestamp="2026-02-18 11:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:24.210849534 +0000 UTC m=+351.639750405" watchObservedRunningTime="2026-02-18 11:57:24.211999415 +0000 UTC m=+351.640900276"
Feb 18 11:57:28 crc kubenswrapper[4880]: I0218 11:57:28.409736 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-559465bd67-5sxdl"]
Feb 18 11:57:28 crc kubenswrapper[4880]: I0218 11:57:28.410476 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl" podUID="17966ec8-2694-480d-9cd0-fd8a7c92e342" containerName="controller-manager" containerID="cri-o://57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66" gracePeriod=30
Feb 18 11:57:28 crc kubenswrapper[4880]: I0218 11:57:28.436795 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"]
Feb 18 11:57:28 crc kubenswrapper[4880]: I0218 11:57:28.437020 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk" podUID="12baeda4-4cb8-4bc6-af12-07c20d71fb6b" containerName="route-controller-manager" containerID="cri-o://08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f" gracePeriod=30
Feb 18 11:57:28 crc kubenswrapper[4880]: I0218 11:57:28.977107 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.037933 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.104997 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-client-ca\") pod \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") "
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.105129 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-config\") pod \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") "
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.105152 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-serving-cert\") pod \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") "
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.105189 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llvgq\" (UniqueName: \"kubernetes.io/projected/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-kube-api-access-llvgq\") pod \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\" (UID: \"12baeda4-4cb8-4bc6-af12-07c20d71fb6b\") "
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.106465 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-config" (OuterVolumeSpecName: "config") pod "12baeda4-4cb8-4bc6-af12-07c20d71fb6b" (UID: "12baeda4-4cb8-4bc6-af12-07c20d71fb6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.106882 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-client-ca" (OuterVolumeSpecName: "client-ca") pod "12baeda4-4cb8-4bc6-af12-07c20d71fb6b" (UID: "12baeda4-4cb8-4bc6-af12-07c20d71fb6b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.112490 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "12baeda4-4cb8-4bc6-af12-07c20d71fb6b" (UID: "12baeda4-4cb8-4bc6-af12-07c20d71fb6b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.112774 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-kube-api-access-llvgq" (OuterVolumeSpecName: "kube-api-access-llvgq") pod "12baeda4-4cb8-4bc6-af12-07c20d71fb6b" (UID: "12baeda4-4cb8-4bc6-af12-07c20d71fb6b"). InnerVolumeSpecName "kube-api-access-llvgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.206258 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnzb6\" (UniqueName: \"kubernetes.io/projected/17966ec8-2694-480d-9cd0-fd8a7c92e342-kube-api-access-qnzb6\") pod \"17966ec8-2694-480d-9cd0-fd8a7c92e342\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") "
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.206299 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17966ec8-2694-480d-9cd0-fd8a7c92e342-serving-cert\") pod \"17966ec8-2694-480d-9cd0-fd8a7c92e342\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") "
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.206320 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-config\") pod \"17966ec8-2694-480d-9cd0-fd8a7c92e342\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") "
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.206348 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-proxy-ca-bundles\") pod \"17966ec8-2694-480d-9cd0-fd8a7c92e342\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") "
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.206405 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-client-ca\") pod \"17966ec8-2694-480d-9cd0-fd8a7c92e342\" (UID: \"17966ec8-2694-480d-9cd0-fd8a7c92e342\") "
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.206563 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-config\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.206575 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.206585 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llvgq\" (UniqueName: \"kubernetes.io/projected/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-kube-api-access-llvgq\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.206595 4880 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12baeda4-4cb8-4bc6-af12-07c20d71fb6b-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.208447 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-client-ca" (OuterVolumeSpecName: "client-ca") pod "17966ec8-2694-480d-9cd0-fd8a7c92e342" (UID: "17966ec8-2694-480d-9cd0-fd8a7c92e342"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.208447 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "17966ec8-2694-480d-9cd0-fd8a7c92e342" (UID: "17966ec8-2694-480d-9cd0-fd8a7c92e342"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.208473 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-config" (OuterVolumeSpecName: "config") pod "17966ec8-2694-480d-9cd0-fd8a7c92e342" (UID: "17966ec8-2694-480d-9cd0-fd8a7c92e342"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.210077 4880 generic.go:334] "Generic (PLEG): container finished" podID="12baeda4-4cb8-4bc6-af12-07c20d71fb6b" containerID="08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f" exitCode=0
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.210211 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk" event={"ID":"12baeda4-4cb8-4bc6-af12-07c20d71fb6b","Type":"ContainerDied","Data":"08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f"}
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.210252 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk" event={"ID":"12baeda4-4cb8-4bc6-af12-07c20d71fb6b","Type":"ContainerDied","Data":"e0ec8dd50d01dc241ae2ef15ffaaf4422f1e400bf291a90a81e8680a0b7e320a"}
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.210274 4880 scope.go:117] "RemoveContainer" containerID="08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.210441 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.211266 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17966ec8-2694-480d-9cd0-fd8a7c92e342-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "17966ec8-2694-480d-9cd0-fd8a7c92e342" (UID: "17966ec8-2694-480d-9cd0-fd8a7c92e342"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.211884 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17966ec8-2694-480d-9cd0-fd8a7c92e342-kube-api-access-qnzb6" (OuterVolumeSpecName: "kube-api-access-qnzb6") pod "17966ec8-2694-480d-9cd0-fd8a7c92e342" (UID: "17966ec8-2694-480d-9cd0-fd8a7c92e342"). InnerVolumeSpecName "kube-api-access-qnzb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.214950 4880 generic.go:334] "Generic (PLEG): container finished" podID="17966ec8-2694-480d-9cd0-fd8a7c92e342" containerID="57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66" exitCode=0
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.215005 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl" event={"ID":"17966ec8-2694-480d-9cd0-fd8a7c92e342","Type":"ContainerDied","Data":"57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66"}
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.215048 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl" event={"ID":"17966ec8-2694-480d-9cd0-fd8a7c92e342","Type":"ContainerDied","Data":"84b45d6f6710e11c52198b5dbc32b80d3efefa4daf745ad21d96a285dd22dc4f"}
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.215145 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-559465bd67-5sxdl"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.221236 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-lvmlf"]
Feb 18 11:57:29 crc kubenswrapper[4880]: E0218 11:57:29.221600 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17966ec8-2694-480d-9cd0-fd8a7c92e342" containerName="controller-manager"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.221632 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="17966ec8-2694-480d-9cd0-fd8a7c92e342" containerName="controller-manager"
Feb 18 11:57:29 crc kubenswrapper[4880]: E0218 11:57:29.221652 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12baeda4-4cb8-4bc6-af12-07c20d71fb6b" containerName="route-controller-manager"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.221658 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="12baeda4-4cb8-4bc6-af12-07c20d71fb6b" containerName="route-controller-manager"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.221742 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="12baeda4-4cb8-4bc6-af12-07c20d71fb6b" containerName="route-controller-manager"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.221756 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="17966ec8-2694-480d-9cd0-fd8a7c92e342" containerName="controller-manager"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.222143 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.226364 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.226472 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.226761 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.226927 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.227926 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.231141 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.234070 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.238840 4880 scope.go:117] "RemoveContainer" containerID="08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f"
Feb 18 11:57:29 crc kubenswrapper[4880]: E0218 11:57:29.239323 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f\": container with ID starting with 08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f not found: ID does not exist"
containerID="08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.239387 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f"} err="failed to get container status \"08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f\": rpc error: code = NotFound desc = could not find container \"08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f\": container with ID starting with 08d39a2d6dd5227543567b3d850b082260fefc4a6fad799659d3c5e04359d69f not found: ID does not exist" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.239426 4880 scope.go:117] "RemoveContainer" containerID="57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.242074 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-lvmlf"] Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.263586 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv"] Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.264321 4880 scope.go:117] "RemoveContainer" containerID="57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.264690 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: E0218 11:57:29.266977 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66\": container with ID starting with 57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66 not found: ID does not exist" containerID="57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.267039 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66"} err="failed to get container status \"57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66\": rpc error: code = NotFound desc = could not find container \"57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66\": container with ID starting with 57794a14d82981fb12ff4ea21ce9390d4ecfbec5987e8f0dce948afc01c1cb66 not found: ID does not exist" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.267262 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.268723 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"] Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.269674 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.270026 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 
11:57:29.270196 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.271002 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.271168 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.274419 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdfcd5df-7p4sk"] Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.276298 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv"] Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.313270 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnzb6\" (UniqueName: \"kubernetes.io/projected/17966ec8-2694-480d-9cd0-fd8a7c92e342-kube-api-access-qnzb6\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.313318 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17966ec8-2694-480d-9cd0-fd8a7c92e342-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.313336 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.313355 4880 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 
11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.313386 4880 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17966ec8-2694-480d-9cd0-fd8a7c92e342-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.320109 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-559465bd67-5sxdl"] Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.323790 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-559465bd67-5sxdl"] Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.414867 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d880eba0-16b8-4949-bf8d-29175f4c5153-serving-cert\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.414931 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf8df\" (UniqueName: \"kubernetes.io/projected/d880eba0-16b8-4949-bf8d-29175f4c5153-kube-api-access-bf8df\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.415026 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-proxy-ca-bundles\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" 
Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.415132 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-client-ca\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.415334 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-config\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.415433 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-config\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.415475 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-client-ca\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.415539 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks7p6\" (UniqueName: 
\"kubernetes.io/projected/cc4445a6-927c-41aa-98ae-9c11d734fdb1-kube-api-access-ks7p6\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.415576 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc4445a6-927c-41aa-98ae-9c11d734fdb1-serving-cert\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.516588 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-config\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.516706 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-config\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.516747 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-client-ca\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 
11:57:29.516804 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks7p6\" (UniqueName: \"kubernetes.io/projected/cc4445a6-927c-41aa-98ae-9c11d734fdb1-kube-api-access-ks7p6\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.516831 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc4445a6-927c-41aa-98ae-9c11d734fdb1-serving-cert\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.516862 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d880eba0-16b8-4949-bf8d-29175f4c5153-serving-cert\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.516889 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-proxy-ca-bundles\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.516913 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf8df\" (UniqueName: \"kubernetes.io/projected/d880eba0-16b8-4949-bf8d-29175f4c5153-kube-api-access-bf8df\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: 
\"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.516934 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-client-ca\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.518239 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-client-ca\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.518349 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-client-ca\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.518472 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-config\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.518779 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-proxy-ca-bundles\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.519791 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-config\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.521842 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc4445a6-927c-41aa-98ae-9c11d734fdb1-serving-cert\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.521976 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d880eba0-16b8-4949-bf8d-29175f4c5153-serving-cert\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.537343 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks7p6\" (UniqueName: \"kubernetes.io/projected/cc4445a6-927c-41aa-98ae-9c11d734fdb1-kube-api-access-ks7p6\") pod \"controller-manager-d6f97d578-lvmlf\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.540013 4880 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bf8df\" (UniqueName: \"kubernetes.io/projected/d880eba0-16b8-4949-bf8d-29175f4c5153-kube-api-access-bf8df\") pod \"route-controller-manager-76946b564d-4zbwv\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.551093 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.584762 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.868889 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-lvmlf"] Feb 18 11:57:29 crc kubenswrapper[4880]: I0218 11:57:29.926198 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv"] Feb 18 11:57:29 crc kubenswrapper[4880]: W0218 11:57:29.933414 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd880eba0_16b8_4949_bf8d_29175f4c5153.slice/crio-2f9cb3f639d7898da5949bc8933345e68ef5d880cca4d3bb47f66c7b20147eed WatchSource:0}: Error finding container 2f9cb3f639d7898da5949bc8933345e68ef5d880cca4d3bb47f66c7b20147eed: Status 404 returned error can't find the container with id 2f9cb3f639d7898da5949bc8933345e68ef5d880cca4d3bb47f66c7b20147eed Feb 18 11:57:30 crc kubenswrapper[4880]: I0218 11:57:30.222484 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" 
event={"ID":"cc4445a6-927c-41aa-98ae-9c11d734fdb1","Type":"ContainerStarted","Data":"b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d"} Feb 18 11:57:30 crc kubenswrapper[4880]: I0218 11:57:30.222545 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" event={"ID":"cc4445a6-927c-41aa-98ae-9c11d734fdb1","Type":"ContainerStarted","Data":"5e7ff8c669c2e09cf8a727fb672df27dcb01b47c1a1ebc68f2e6e619b34d77a9"} Feb 18 11:57:30 crc kubenswrapper[4880]: I0218 11:57:30.222694 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:30 crc kubenswrapper[4880]: I0218 11:57:30.223853 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" event={"ID":"d880eba0-16b8-4949-bf8d-29175f4c5153","Type":"ContainerStarted","Data":"d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461"} Feb 18 11:57:30 crc kubenswrapper[4880]: I0218 11:57:30.223889 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" event={"ID":"d880eba0-16b8-4949-bf8d-29175f4c5153","Type":"ContainerStarted","Data":"2f9cb3f639d7898da5949bc8933345e68ef5d880cca4d3bb47f66c7b20147eed"} Feb 18 11:57:30 crc kubenswrapper[4880]: I0218 11:57:30.224081 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:30 crc kubenswrapper[4880]: I0218 11:57:30.232835 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:30 crc kubenswrapper[4880]: I0218 11:57:30.250603 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" podStartSLOduration=1.250582115 podStartE2EDuration="1.250582115s" podCreationTimestamp="2026-02-18 11:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:30.248421456 +0000 UTC m=+357.677322337" watchObservedRunningTime="2026-02-18 11:57:30.250582115 +0000 UTC m=+357.679482976" Feb 18 11:57:30 crc kubenswrapper[4880]: I0218 11:57:30.758906 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:30 crc kubenswrapper[4880]: I0218 11:57:30.778556 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" podStartSLOduration=1.778535539 podStartE2EDuration="1.778535539s" podCreationTimestamp="2026-02-18 11:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:30.310776595 +0000 UTC m=+357.739677456" watchObservedRunningTime="2026-02-18 11:57:30.778535539 +0000 UTC m=+358.207436390" Feb 18 11:57:31 crc kubenswrapper[4880]: I0218 11:57:31.186342 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12baeda4-4cb8-4bc6-af12-07c20d71fb6b" path="/var/lib/kubelet/pods/12baeda4-4cb8-4bc6-af12-07c20d71fb6b/volumes" Feb 18 11:57:31 crc kubenswrapper[4880]: I0218 11:57:31.186954 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17966ec8-2694-480d-9cd0-fd8a7c92e342" path="/var/lib/kubelet/pods/17966ec8-2694-480d-9cd0-fd8a7c92e342/volumes" Feb 18 11:57:40 crc kubenswrapper[4880]: I0218 11:57:40.448023 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-lvmlf"] Feb 18 11:57:40 crc 
kubenswrapper[4880]: I0218 11:57:40.449195 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" podUID="cc4445a6-927c-41aa-98ae-9c11d734fdb1" containerName="controller-manager" containerID="cri-o://b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d" gracePeriod=30 Feb 18 11:57:40 crc kubenswrapper[4880]: I0218 11:57:40.455277 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv"] Feb 18 11:57:40 crc kubenswrapper[4880]: I0218 11:57:40.455492 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" podUID="d880eba0-16b8-4949-bf8d-29175f4c5153" containerName="route-controller-manager" containerID="cri-o://d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461" gracePeriod=30 Feb 18 11:57:40 crc kubenswrapper[4880]: I0218 11:57:40.955960 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.066646 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-config\") pod \"d880eba0-16b8-4949-bf8d-29175f4c5153\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.066703 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-client-ca\") pod \"d880eba0-16b8-4949-bf8d-29175f4c5153\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.066747 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d880eba0-16b8-4949-bf8d-29175f4c5153-serving-cert\") pod \"d880eba0-16b8-4949-bf8d-29175f4c5153\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.066769 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf8df\" (UniqueName: \"kubernetes.io/projected/d880eba0-16b8-4949-bf8d-29175f4c5153-kube-api-access-bf8df\") pod \"d880eba0-16b8-4949-bf8d-29175f4c5153\" (UID: \"d880eba0-16b8-4949-bf8d-29175f4c5153\") " Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.067747 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-config" (OuterVolumeSpecName: "config") pod "d880eba0-16b8-4949-bf8d-29175f4c5153" (UID: "d880eba0-16b8-4949-bf8d-29175f4c5153"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.068314 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-client-ca" (OuterVolumeSpecName: "client-ca") pod "d880eba0-16b8-4949-bf8d-29175f4c5153" (UID: "d880eba0-16b8-4949-bf8d-29175f4c5153"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.072996 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d880eba0-16b8-4949-bf8d-29175f4c5153-kube-api-access-bf8df" (OuterVolumeSpecName: "kube-api-access-bf8df") pod "d880eba0-16b8-4949-bf8d-29175f4c5153" (UID: "d880eba0-16b8-4949-bf8d-29175f4c5153"). InnerVolumeSpecName "kube-api-access-bf8df". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.073810 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d880eba0-16b8-4949-bf8d-29175f4c5153-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d880eba0-16b8-4949-bf8d-29175f4c5153" (UID: "d880eba0-16b8-4949-bf8d-29175f4c5153"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.113933 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.168081 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d880eba0-16b8-4949-bf8d-29175f4c5153-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.168152 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf8df\" (UniqueName: \"kubernetes.io/projected/d880eba0-16b8-4949-bf8d-29175f4c5153-kube-api-access-bf8df\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.168183 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.168196 4880 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d880eba0-16b8-4949-bf8d-29175f4c5153-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.269476 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-client-ca\") pod \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.269586 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc4445a6-927c-41aa-98ae-9c11d734fdb1-serving-cert\") pod \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.269692 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ks7p6\" (UniqueName: \"kubernetes.io/projected/cc4445a6-927c-41aa-98ae-9c11d734fdb1-kube-api-access-ks7p6\") pod \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.269736 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-proxy-ca-bundles\") pod \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.269793 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-config\") pod \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\" (UID: \"cc4445a6-927c-41aa-98ae-9c11d734fdb1\") " Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.270632 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-client-ca" (OuterVolumeSpecName: "client-ca") pod "cc4445a6-927c-41aa-98ae-9c11d734fdb1" (UID: "cc4445a6-927c-41aa-98ae-9c11d734fdb1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.270759 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-config" (OuterVolumeSpecName: "config") pod "cc4445a6-927c-41aa-98ae-9c11d734fdb1" (UID: "cc4445a6-927c-41aa-98ae-9c11d734fdb1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.270750 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cc4445a6-927c-41aa-98ae-9c11d734fdb1" (UID: "cc4445a6-927c-41aa-98ae-9c11d734fdb1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.273251 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4445a6-927c-41aa-98ae-9c11d734fdb1-kube-api-access-ks7p6" (OuterVolumeSpecName: "kube-api-access-ks7p6") pod "cc4445a6-927c-41aa-98ae-9c11d734fdb1" (UID: "cc4445a6-927c-41aa-98ae-9c11d734fdb1"). InnerVolumeSpecName "kube-api-access-ks7p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.273305 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4445a6-927c-41aa-98ae-9c11d734fdb1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cc4445a6-927c-41aa-98ae-9c11d734fdb1" (UID: "cc4445a6-927c-41aa-98ae-9c11d734fdb1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.283351 4880 generic.go:334] "Generic (PLEG): container finished" podID="d880eba0-16b8-4949-bf8d-29175f4c5153" containerID="d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461" exitCode=0 Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.283417 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.283417 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" event={"ID":"d880eba0-16b8-4949-bf8d-29175f4c5153","Type":"ContainerDied","Data":"d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461"} Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.283490 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv" event={"ID":"d880eba0-16b8-4949-bf8d-29175f4c5153","Type":"ContainerDied","Data":"2f9cb3f639d7898da5949bc8933345e68ef5d880cca4d3bb47f66c7b20147eed"} Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.283521 4880 scope.go:117] "RemoveContainer" containerID="d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.287204 4880 generic.go:334] "Generic (PLEG): container finished" podID="cc4445a6-927c-41aa-98ae-9c11d734fdb1" containerID="b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d" exitCode=0 Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.287268 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" event={"ID":"cc4445a6-927c-41aa-98ae-9c11d734fdb1","Type":"ContainerDied","Data":"b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d"} Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.287302 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" event={"ID":"cc4445a6-927c-41aa-98ae-9c11d734fdb1","Type":"ContainerDied","Data":"5e7ff8c669c2e09cf8a727fb672df27dcb01b47c1a1ebc68f2e6e619b34d77a9"} Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.287416 4880 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-lvmlf" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.306719 4880 scope.go:117] "RemoveContainer" containerID="d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461" Feb 18 11:57:41 crc kubenswrapper[4880]: E0218 11:57:41.307263 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461\": container with ID starting with d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461 not found: ID does not exist" containerID="d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.307299 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461"} err="failed to get container status \"d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461\": rpc error: code = NotFound desc = could not find container \"d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461\": container with ID starting with d2de1ea1b2527c0811079db9eb1a599618b0218c039d1f55ad60a8fbd5040461 not found: ID does not exist" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.307326 4880 scope.go:117] "RemoveContainer" containerID="b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.308410 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv"] Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.312571 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4zbwv"] Feb 18 11:57:41 crc kubenswrapper[4880]: 
I0218 11:57:41.327924 4880 scope.go:117] "RemoveContainer" containerID="b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d" Feb 18 11:57:41 crc kubenswrapper[4880]: E0218 11:57:41.328712 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d\": container with ID starting with b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d not found: ID does not exist" containerID="b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.328747 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d"} err="failed to get container status \"b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d\": rpc error: code = NotFound desc = could not find container \"b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d\": container with ID starting with b12db61e65f7bbb15534139301b098df670d3261bfd4237680dd97ebadec105d not found: ID does not exist" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.331287 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-lvmlf"] Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.334288 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-lvmlf"] Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.371485 4880 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc4445a6-927c-41aa-98ae-9c11d734fdb1-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.371515 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks7p6\" (UniqueName: 
\"kubernetes.io/projected/cc4445a6-927c-41aa-98ae-9c11d734fdb1-kube-api-access-ks7p6\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.371527 4880 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.371536 4880 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-config\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:41 crc kubenswrapper[4880]: I0218 11:57:41.371545 4880 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc4445a6-927c-41aa-98ae-9c11d734fdb1-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.474700 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r"] Feb 18 11:57:42 crc kubenswrapper[4880]: E0218 11:57:42.475843 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d880eba0-16b8-4949-bf8d-29175f4c5153" containerName="route-controller-manager" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.475862 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="d880eba0-16b8-4949-bf8d-29175f4c5153" containerName="route-controller-manager" Feb 18 11:57:42 crc kubenswrapper[4880]: E0218 11:57:42.475920 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4445a6-927c-41aa-98ae-9c11d734fdb1" containerName="controller-manager" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.475933 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4445a6-927c-41aa-98ae-9c11d734fdb1" containerName="controller-manager" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.476071 4880 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4445a6-927c-41aa-98ae-9c11d734fdb1" containerName="controller-manager" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.476091 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="d880eba0-16b8-4949-bf8d-29175f4c5153" containerName="route-controller-manager" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.476940 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.481136 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.481505 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.481825 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.481867 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.481971 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.482330 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.482401 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59f6c65594-js4qx"] Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.483952 4880 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.485906 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.486219 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.487917 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r"] Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.488018 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.488083 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.488323 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.488565 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.495423 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.498295 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59f6c65594-js4qx"] Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.586656 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/5762cae5-6235-4004-b7b8-e88de488ee82-proxy-ca-bundles\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.586712 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5762cae5-6235-4004-b7b8-e88de488ee82-config\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.586736 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rckz\" (UniqueName: \"kubernetes.io/projected/eafea59c-1c53-499a-837f-de1636866b7e-kube-api-access-5rckz\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.586756 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eafea59c-1c53-499a-837f-de1636866b7e-config\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.586771 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eafea59c-1c53-499a-837f-de1636866b7e-client-ca\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " 
pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.586805 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5762cae5-6235-4004-b7b8-e88de488ee82-client-ca\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.586854 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5762cae5-6235-4004-b7b8-e88de488ee82-serving-cert\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.587273 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eafea59c-1c53-499a-837f-de1636866b7e-serving-cert\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.587307 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg55c\" (UniqueName: \"kubernetes.io/projected/5762cae5-6235-4004-b7b8-e88de488ee82-kube-api-access-hg55c\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.689584 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5762cae5-6235-4004-b7b8-e88de488ee82-proxy-ca-bundles\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.689657 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5762cae5-6235-4004-b7b8-e88de488ee82-config\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.689689 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rckz\" (UniqueName: \"kubernetes.io/projected/eafea59c-1c53-499a-837f-de1636866b7e-kube-api-access-5rckz\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.689707 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eafea59c-1c53-499a-837f-de1636866b7e-config\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.689724 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eafea59c-1c53-499a-837f-de1636866b7e-client-ca\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 
18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.689750 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5762cae5-6235-4004-b7b8-e88de488ee82-client-ca\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.689765 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5762cae5-6235-4004-b7b8-e88de488ee82-serving-cert\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.689794 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eafea59c-1c53-499a-837f-de1636866b7e-serving-cert\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.689811 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg55c\" (UniqueName: \"kubernetes.io/projected/5762cae5-6235-4004-b7b8-e88de488ee82-kube-api-access-hg55c\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.690995 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5762cae5-6235-4004-b7b8-e88de488ee82-proxy-ca-bundles\") pod \"controller-manager-59f6c65594-js4qx\" (UID: 
\"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.691047 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eafea59c-1c53-499a-837f-de1636866b7e-client-ca\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.691157 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eafea59c-1c53-499a-837f-de1636866b7e-config\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.691410 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5762cae5-6235-4004-b7b8-e88de488ee82-client-ca\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.692286 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5762cae5-6235-4004-b7b8-e88de488ee82-config\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.693918 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eafea59c-1c53-499a-837f-de1636866b7e-serving-cert\") 
pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.694018 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5762cae5-6235-4004-b7b8-e88de488ee82-serving-cert\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.706554 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rckz\" (UniqueName: \"kubernetes.io/projected/eafea59c-1c53-499a-837f-de1636866b7e-kube-api-access-5rckz\") pod \"route-controller-manager-5bfbfb75fd-5z77r\" (UID: \"eafea59c-1c53-499a-837f-de1636866b7e\") " pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.709524 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg55c\" (UniqueName: \"kubernetes.io/projected/5762cae5-6235-4004-b7b8-e88de488ee82-kube-api-access-hg55c\") pod \"controller-manager-59f6c65594-js4qx\" (UID: \"5762cae5-6235-4004-b7b8-e88de488ee82\") " pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.803509 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:42 crc kubenswrapper[4880]: I0218 11:57:42.816272 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:43 crc kubenswrapper[4880]: I0218 11:57:43.197827 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc4445a6-927c-41aa-98ae-9c11d734fdb1" path="/var/lib/kubelet/pods/cc4445a6-927c-41aa-98ae-9c11d734fdb1/volumes" Feb 18 11:57:43 crc kubenswrapper[4880]: I0218 11:57:43.198769 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d880eba0-16b8-4949-bf8d-29175f4c5153" path="/var/lib/kubelet/pods/d880eba0-16b8-4949-bf8d-29175f4c5153/volumes" Feb 18 11:57:43 crc kubenswrapper[4880]: I0218 11:57:43.199397 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r"] Feb 18 11:57:43 crc kubenswrapper[4880]: W0218 11:57:43.200654 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeafea59c_1c53_499a_837f_de1636866b7e.slice/crio-299520c969be665b3f8259131991af58fa9d414a2d4be9bbf6f978681d0c48f2 WatchSource:0}: Error finding container 299520c969be665b3f8259131991af58fa9d414a2d4be9bbf6f978681d0c48f2: Status 404 returned error can't find the container with id 299520c969be665b3f8259131991af58fa9d414a2d4be9bbf6f978681d0c48f2 Feb 18 11:57:43 crc kubenswrapper[4880]: I0218 11:57:43.236660 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59f6c65594-js4qx"] Feb 18 11:57:43 crc kubenswrapper[4880]: W0218 11:57:43.256534 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5762cae5_6235_4004_b7b8_e88de488ee82.slice/crio-13323747f75fbe05dd9dca752bfd51bb308af79ed4f0725b7754103a5dd1890c WatchSource:0}: Error finding container 13323747f75fbe05dd9dca752bfd51bb308af79ed4f0725b7754103a5dd1890c: Status 404 returned error can't find the container with 
id 13323747f75fbe05dd9dca752bfd51bb308af79ed4f0725b7754103a5dd1890c Feb 18 11:57:43 crc kubenswrapper[4880]: I0218 11:57:43.300267 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" event={"ID":"eafea59c-1c53-499a-837f-de1636866b7e","Type":"ContainerStarted","Data":"299520c969be665b3f8259131991af58fa9d414a2d4be9bbf6f978681d0c48f2"} Feb 18 11:57:43 crc kubenswrapper[4880]: I0218 11:57:43.300996 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" event={"ID":"5762cae5-6235-4004-b7b8-e88de488ee82","Type":"ContainerStarted","Data":"13323747f75fbe05dd9dca752bfd51bb308af79ed4f0725b7754103a5dd1890c"} Feb 18 11:57:44 crc kubenswrapper[4880]: I0218 11:57:44.310389 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" event={"ID":"5762cae5-6235-4004-b7b8-e88de488ee82","Type":"ContainerStarted","Data":"706456f0c49eddfe98580c162cb3e8b873a79ba7c1f53c2748f0ab8cdb0ea94b"} Feb 18 11:57:44 crc kubenswrapper[4880]: I0218 11:57:44.310826 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:44 crc kubenswrapper[4880]: I0218 11:57:44.313556 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" event={"ID":"eafea59c-1c53-499a-837f-de1636866b7e","Type":"ContainerStarted","Data":"8bd85b068e0f5a273262c50477e4484dab4cde43dc15254cb02a31a4b5d9d04f"} Feb 18 11:57:44 crc kubenswrapper[4880]: I0218 11:57:44.313817 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:44 crc kubenswrapper[4880]: I0218 11:57:44.318007 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" Feb 18 11:57:44 crc kubenswrapper[4880]: I0218 11:57:44.318664 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" Feb 18 11:57:44 crc kubenswrapper[4880]: I0218 11:57:44.352119 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59f6c65594-js4qx" podStartSLOduration=4.352092105 podStartE2EDuration="4.352092105s" podCreationTimestamp="2026-02-18 11:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:44.332923049 +0000 UTC m=+371.761823910" watchObservedRunningTime="2026-02-18 11:57:44.352092105 +0000 UTC m=+371.780992986" Feb 18 11:57:44 crc kubenswrapper[4880]: I0218 11:57:44.352833 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5bfbfb75fd-5z77r" podStartSLOduration=4.352828135 podStartE2EDuration="4.352828135s" podCreationTimestamp="2026-02-18 11:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:44.347855479 +0000 UTC m=+371.776756340" watchObservedRunningTime="2026-02-18 11:57:44.352828135 +0000 UTC m=+371.781728986" Feb 18 11:57:49 crc kubenswrapper[4880]: I0218 11:57:49.899439 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-92z9r"] Feb 18 11:57:49 crc kubenswrapper[4880]: I0218 11:57:49.904650 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:49 crc kubenswrapper[4880]: I0218 11:57:49.914736 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 18 11:57:49 crc kubenswrapper[4880]: I0218 11:57:49.917338 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-92z9r"] Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.083218 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e3fa583-5b51-4f41-9c12-71e0c018be1e-utilities\") pod \"community-operators-92z9r\" (UID: \"3e3fa583-5b51-4f41-9c12-71e0c018be1e\") " pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.083293 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e3fa583-5b51-4f41-9c12-71e0c018be1e-catalog-content\") pod \"community-operators-92z9r\" (UID: \"3e3fa583-5b51-4f41-9c12-71e0c018be1e\") " pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.083334 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk6dk\" (UniqueName: \"kubernetes.io/projected/3e3fa583-5b51-4f41-9c12-71e0c018be1e-kube-api-access-vk6dk\") pod \"community-operators-92z9r\" (UID: \"3e3fa583-5b51-4f41-9c12-71e0c018be1e\") " pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.184750 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e3fa583-5b51-4f41-9c12-71e0c018be1e-catalog-content\") pod \"community-operators-92z9r\" (UID: 
\"3e3fa583-5b51-4f41-9c12-71e0c018be1e\") " pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.184853 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk6dk\" (UniqueName: \"kubernetes.io/projected/3e3fa583-5b51-4f41-9c12-71e0c018be1e-kube-api-access-vk6dk\") pod \"community-operators-92z9r\" (UID: \"3e3fa583-5b51-4f41-9c12-71e0c018be1e\") " pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.185398 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e3fa583-5b51-4f41-9c12-71e0c018be1e-catalog-content\") pod \"community-operators-92z9r\" (UID: \"3e3fa583-5b51-4f41-9c12-71e0c018be1e\") " pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.185442 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e3fa583-5b51-4f41-9c12-71e0c018be1e-utilities\") pod \"community-operators-92z9r\" (UID: \"3e3fa583-5b51-4f41-9c12-71e0c018be1e\") " pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.185493 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e3fa583-5b51-4f41-9c12-71e0c018be1e-utilities\") pod \"community-operators-92z9r\" (UID: \"3e3fa583-5b51-4f41-9c12-71e0c018be1e\") " pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.209629 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk6dk\" (UniqueName: \"kubernetes.io/projected/3e3fa583-5b51-4f41-9c12-71e0c018be1e-kube-api-access-vk6dk\") pod \"community-operators-92z9r\" (UID: 
\"3e3fa583-5b51-4f41-9c12-71e0c018be1e\") " pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.232460 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-92z9r" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.649578 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-92z9r"] Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.894376 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hdgb8"] Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.895856 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.900718 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 18 11:57:50 crc kubenswrapper[4880]: I0218 11:57:50.907136 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdgb8"] Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.096825 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xjx5\" (UniqueName: \"kubernetes.io/projected/679e3cd9-43cb-4010-93ce-d12efd5fc702-kube-api-access-5xjx5\") pod \"certified-operators-hdgb8\" (UID: \"679e3cd9-43cb-4010-93ce-d12efd5fc702\") " pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.097133 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/679e3cd9-43cb-4010-93ce-d12efd5fc702-utilities\") pod \"certified-operators-hdgb8\" (UID: \"679e3cd9-43cb-4010-93ce-d12efd5fc702\") " 
pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.097195 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/679e3cd9-43cb-4010-93ce-d12efd5fc702-catalog-content\") pod \"certified-operators-hdgb8\" (UID: \"679e3cd9-43cb-4010-93ce-d12efd5fc702\") " pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.198043 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/679e3cd9-43cb-4010-93ce-d12efd5fc702-catalog-content\") pod \"certified-operators-hdgb8\" (UID: \"679e3cd9-43cb-4010-93ce-d12efd5fc702\") " pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.198163 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xjx5\" (UniqueName: \"kubernetes.io/projected/679e3cd9-43cb-4010-93ce-d12efd5fc702-kube-api-access-5xjx5\") pod \"certified-operators-hdgb8\" (UID: \"679e3cd9-43cb-4010-93ce-d12efd5fc702\") " pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.198199 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/679e3cd9-43cb-4010-93ce-d12efd5fc702-utilities\") pod \"certified-operators-hdgb8\" (UID: \"679e3cd9-43cb-4010-93ce-d12efd5fc702\") " pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.198725 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/679e3cd9-43cb-4010-93ce-d12efd5fc702-catalog-content\") pod \"certified-operators-hdgb8\" (UID: \"679e3cd9-43cb-4010-93ce-d12efd5fc702\") " 
pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.198778 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/679e3cd9-43cb-4010-93ce-d12efd5fc702-utilities\") pod \"certified-operators-hdgb8\" (UID: \"679e3cd9-43cb-4010-93ce-d12efd5fc702\") " pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.217390 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xjx5\" (UniqueName: \"kubernetes.io/projected/679e3cd9-43cb-4010-93ce-d12efd5fc702-kube-api-access-5xjx5\") pod \"certified-operators-hdgb8\" (UID: \"679e3cd9-43cb-4010-93ce-d12efd5fc702\") " pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.219844 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdgb8" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.338535 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fcstk"] Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.339659 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.345048 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fcstk"] Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.370165 4880 generic.go:334] "Generic (PLEG): container finished" podID="3e3fa583-5b51-4f41-9c12-71e0c018be1e" containerID="ff464d01b52db09d8af803732a44309934b989e471177230a23f28a37233c03e" exitCode=0 Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.370279 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92z9r" event={"ID":"3e3fa583-5b51-4f41-9c12-71e0c018be1e","Type":"ContainerDied","Data":"ff464d01b52db09d8af803732a44309934b989e471177230a23f28a37233c03e"} Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.370651 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92z9r" event={"ID":"3e3fa583-5b51-4f41-9c12-71e0c018be1e","Type":"ContainerStarted","Data":"41e06a47975669463f9c588debde1ded1d06402884734a94c074907943e2aa9c"} Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.502135 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3825aef-9b37-486b-8324-0f6d9c8b1d66-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.502185 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4tx\" (UniqueName: \"kubernetes.io/projected/b3825aef-9b37-486b-8324-0f6d9c8b1d66-kube-api-access-vd4tx\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.502212 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3825aef-9b37-486b-8324-0f6d9c8b1d66-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.502265 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3825aef-9b37-486b-8324-0f6d9c8b1d66-trusted-ca\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.502298 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3825aef-9b37-486b-8324-0f6d9c8b1d66-registry-certificates\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.502332 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.502482 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/b3825aef-9b37-486b-8324-0f6d9c8b1d66-bound-sa-token\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.502590 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3825aef-9b37-486b-8324-0f6d9c8b1d66-registry-tls\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.523308 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.603327 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3825aef-9b37-486b-8324-0f6d9c8b1d66-registry-tls\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.603402 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3825aef-9b37-486b-8324-0f6d9c8b1d66-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 
11:57:51.603428 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4tx\" (UniqueName: \"kubernetes.io/projected/b3825aef-9b37-486b-8324-0f6d9c8b1d66-kube-api-access-vd4tx\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.603449 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3825aef-9b37-486b-8324-0f6d9c8b1d66-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.603468 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3825aef-9b37-486b-8324-0f6d9c8b1d66-trusted-ca\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.603489 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3825aef-9b37-486b-8324-0f6d9c8b1d66-registry-certificates\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.603526 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3825aef-9b37-486b-8324-0f6d9c8b1d66-bound-sa-token\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.604218 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3825aef-9b37-486b-8324-0f6d9c8b1d66-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.604864 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3825aef-9b37-486b-8324-0f6d9c8b1d66-trusted-ca\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.605060 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3825aef-9b37-486b-8324-0f6d9c8b1d66-registry-certificates\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.609959 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3825aef-9b37-486b-8324-0f6d9c8b1d66-registry-tls\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.619148 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3825aef-9b37-486b-8324-0f6d9c8b1d66-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fcstk\" (UID: 
\"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.620799 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4tx\" (UniqueName: \"kubernetes.io/projected/b3825aef-9b37-486b-8324-0f6d9c8b1d66-kube-api-access-vd4tx\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.622389 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3825aef-9b37-486b-8324-0f6d9c8b1d66-bound-sa-token\") pod \"image-registry-66df7c8f76-fcstk\" (UID: \"b3825aef-9b37-486b-8324-0f6d9c8b1d66\") " pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.660051 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdgb8"] Feb 18 11:57:51 crc kubenswrapper[4880]: I0218 11:57:51.669526 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.072869 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fcstk"] Feb 18 11:57:52 crc kubenswrapper[4880]: W0218 11:57:52.087323 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3825aef_9b37_486b_8324_0f6d9c8b1d66.slice/crio-58006625db05ed5a550f82dbfe7bd8d718176dcd5761059f9f80bb0b77f30497 WatchSource:0}: Error finding container 58006625db05ed5a550f82dbfe7bd8d718176dcd5761059f9f80bb0b77f30497: Status 404 returned error can't find the container with id 58006625db05ed5a550f82dbfe7bd8d718176dcd5761059f9f80bb0b77f30497 Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.290061 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l2szd"] Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.291373 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.293398 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.305006 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2szd"] Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.380418 4880 generic.go:334] "Generic (PLEG): container finished" podID="679e3cd9-43cb-4010-93ce-d12efd5fc702" containerID="043a590ec1b668ef28c6ae58f90d143ad7c5ebedf789ae9afcd06137cc44282b" exitCode=0 Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.380500 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdgb8" event={"ID":"679e3cd9-43cb-4010-93ce-d12efd5fc702","Type":"ContainerDied","Data":"043a590ec1b668ef28c6ae58f90d143ad7c5ebedf789ae9afcd06137cc44282b"} Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.380532 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdgb8" event={"ID":"679e3cd9-43cb-4010-93ce-d12efd5fc702","Type":"ContainerStarted","Data":"bbf75de9b2cb539225a7defba759547c03f9a5554c034d4aedce870c759888a7"} Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.385235 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92z9r" event={"ID":"3e3fa583-5b51-4f41-9c12-71e0c018be1e","Type":"ContainerStarted","Data":"53c5a5dc9959b564f63ec1fba795f481c9806c31958a506a4694af23404425b7"} Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.389311 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" event={"ID":"b3825aef-9b37-486b-8324-0f6d9c8b1d66","Type":"ContainerStarted","Data":"bb993846400ea8b0ba644796c109811a68a3ce916144c040a97cbe9e57e9acb9"} 
Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.389365 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" event={"ID":"b3825aef-9b37-486b-8324-0f6d9c8b1d66","Type":"ContainerStarted","Data":"58006625db05ed5a550f82dbfe7bd8d718176dcd5761059f9f80bb0b77f30497"} Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.389449 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.442573 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-fcstk" podStartSLOduration=1.442550068 podStartE2EDuration="1.442550068s" podCreationTimestamp="2026-02-18 11:57:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 11:57:52.436089741 +0000 UTC m=+379.864990622" watchObservedRunningTime="2026-02-18 11:57:52.442550068 +0000 UTC m=+379.871450929" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.452329 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba267750-47ef-4ac0-ad93-d82731ca5b8b-utilities\") pod \"redhat-marketplace-l2szd\" (UID: \"ba267750-47ef-4ac0-ad93-d82731ca5b8b\") " pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.452415 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbd46\" (UniqueName: \"kubernetes.io/projected/ba267750-47ef-4ac0-ad93-d82731ca5b8b-kube-api-access-zbd46\") pod \"redhat-marketplace-l2szd\" (UID: \"ba267750-47ef-4ac0-ad93-d82731ca5b8b\") " pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 
11:57:52.452800 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba267750-47ef-4ac0-ad93-d82731ca5b8b-catalog-content\") pod \"redhat-marketplace-l2szd\" (UID: \"ba267750-47ef-4ac0-ad93-d82731ca5b8b\") " pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.553811 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbd46\" (UniqueName: \"kubernetes.io/projected/ba267750-47ef-4ac0-ad93-d82731ca5b8b-kube-api-access-zbd46\") pod \"redhat-marketplace-l2szd\" (UID: \"ba267750-47ef-4ac0-ad93-d82731ca5b8b\") " pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.553913 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba267750-47ef-4ac0-ad93-d82731ca5b8b-catalog-content\") pod \"redhat-marketplace-l2szd\" (UID: \"ba267750-47ef-4ac0-ad93-d82731ca5b8b\") " pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.554019 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba267750-47ef-4ac0-ad93-d82731ca5b8b-utilities\") pod \"redhat-marketplace-l2szd\" (UID: \"ba267750-47ef-4ac0-ad93-d82731ca5b8b\") " pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.554484 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba267750-47ef-4ac0-ad93-d82731ca5b8b-utilities\") pod \"redhat-marketplace-l2szd\" (UID: \"ba267750-47ef-4ac0-ad93-d82731ca5b8b\") " pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.555353 4880 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba267750-47ef-4ac0-ad93-d82731ca5b8b-catalog-content\") pod \"redhat-marketplace-l2szd\" (UID: \"ba267750-47ef-4ac0-ad93-d82731ca5b8b\") " pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.579970 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbd46\" (UniqueName: \"kubernetes.io/projected/ba267750-47ef-4ac0-ad93-d82731ca5b8b-kube-api-access-zbd46\") pod \"redhat-marketplace-l2szd\" (UID: \"ba267750-47ef-4ac0-ad93-d82731ca5b8b\") " pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:52 crc kubenswrapper[4880]: I0218 11:57:52.673540 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2szd" Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.070044 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2szd"] Feb 18 11:57:53 crc kubenswrapper[4880]: W0218 11:57:53.090072 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba267750_47ef_4ac0_ad93_d82731ca5b8b.slice/crio-9a135a495bb129e329ca45830c23d5b41f86e4970bbef629aea5090661609075 WatchSource:0}: Error finding container 9a135a495bb129e329ca45830c23d5b41f86e4970bbef629aea5090661609075: Status 404 returned error can't find the container with id 9a135a495bb129e329ca45830c23d5b41f86e4970bbef629aea5090661609075 Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.274120 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 11:57:53 crc 
kubenswrapper[4880]: I0218 11:57:53.274396 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.293287 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gjvdn"]
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.294906 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.296946 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.301357 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjvdn"]
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.395514 4880 generic.go:334] "Generic (PLEG): container finished" podID="ba267750-47ef-4ac0-ad93-d82731ca5b8b" containerID="0c5913a06c9635d0fb780f49b3cbda271e58986780bfc277f926c9dbf56bbac8" exitCode=0
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.395578 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2szd" event={"ID":"ba267750-47ef-4ac0-ad93-d82731ca5b8b","Type":"ContainerDied","Data":"0c5913a06c9635d0fb780f49b3cbda271e58986780bfc277f926c9dbf56bbac8"}
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.395625 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2szd" event={"ID":"ba267750-47ef-4ac0-ad93-d82731ca5b8b","Type":"ContainerStarted","Data":"9a135a495bb129e329ca45830c23d5b41f86e4970bbef629aea5090661609075"}
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.400882 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdgb8" event={"ID":"679e3cd9-43cb-4010-93ce-d12efd5fc702","Type":"ContainerStarted","Data":"3fa4154e28d0416fd8329d2f8bf77a6f31667634dad2df666ffc1d8d5f8dc01f"}
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.404054 4880 generic.go:334] "Generic (PLEG): container finished" podID="3e3fa583-5b51-4f41-9c12-71e0c018be1e" containerID="53c5a5dc9959b564f63ec1fba795f481c9806c31958a506a4694af23404425b7" exitCode=0
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.404924 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92z9r" event={"ID":"3e3fa583-5b51-4f41-9c12-71e0c018be1e","Type":"ContainerDied","Data":"53c5a5dc9959b564f63ec1fba795f481c9806c31958a506a4694af23404425b7"}
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.471636 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64-utilities\") pod \"redhat-operators-gjvdn\" (UID: \"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64\") " pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.471784 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64-catalog-content\") pod \"redhat-operators-gjvdn\" (UID: \"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64\") " pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.471821 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq779\" (UniqueName: \"kubernetes.io/projected/a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64-kube-api-access-rq779\") pod \"redhat-operators-gjvdn\" (UID: \"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64\") " pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.572697 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64-utilities\") pod \"redhat-operators-gjvdn\" (UID: \"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64\") " pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.572772 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64-catalog-content\") pod \"redhat-operators-gjvdn\" (UID: \"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64\") " pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.572790 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq779\" (UniqueName: \"kubernetes.io/projected/a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64-kube-api-access-rq779\") pod \"redhat-operators-gjvdn\" (UID: \"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64\") " pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.574206 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64-utilities\") pod \"redhat-operators-gjvdn\" (UID: \"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64\") " pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.574739 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64-catalog-content\") pod \"redhat-operators-gjvdn\" (UID: \"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64\") " pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.600663 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq779\" (UniqueName: \"kubernetes.io/projected/a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64-kube-api-access-rq779\") pod \"redhat-operators-gjvdn\" (UID: \"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64\") " pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:53 crc kubenswrapper[4880]: I0218 11:57:53.666397 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:57:54 crc kubenswrapper[4880]: I0218 11:57:54.118695 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjvdn"]
Feb 18 11:57:54 crc kubenswrapper[4880]: I0218 11:57:54.412861 4880 generic.go:334] "Generic (PLEG): container finished" podID="679e3cd9-43cb-4010-93ce-d12efd5fc702" containerID="3fa4154e28d0416fd8329d2f8bf77a6f31667634dad2df666ffc1d8d5f8dc01f" exitCode=0
Feb 18 11:57:54 crc kubenswrapper[4880]: I0218 11:57:54.412949 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdgb8" event={"ID":"679e3cd9-43cb-4010-93ce-d12efd5fc702","Type":"ContainerDied","Data":"3fa4154e28d0416fd8329d2f8bf77a6f31667634dad2df666ffc1d8d5f8dc01f"}
Feb 18 11:57:54 crc kubenswrapper[4880]: I0218 11:57:54.425957 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92z9r" event={"ID":"3e3fa583-5b51-4f41-9c12-71e0c018be1e","Type":"ContainerStarted","Data":"9b59b4c0254ae2d6e93fb9e1e09a677a22eaddf3135a9ae1ea700f759d00d2f2"}
Feb 18 11:57:54 crc kubenswrapper[4880]: I0218 11:57:54.427761 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjvdn" event={"ID":"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64","Type":"ContainerStarted","Data":"54792a511c615910550d8e72fe74d18d98b4ae3af84e7fde27e789969948a51f"}
Feb 18 11:57:54 crc kubenswrapper[4880]: I0218 11:57:54.454596 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-92z9r" podStartSLOduration=2.941035739 podStartE2EDuration="5.454573769s" podCreationTimestamp="2026-02-18 11:57:49 +0000 UTC" firstStartedPulling="2026-02-18 11:57:51.373045477 +0000 UTC m=+378.801946338" lastFinishedPulling="2026-02-18 11:57:53.886583507 +0000 UTC m=+381.315484368" observedRunningTime="2026-02-18 11:57:54.451874155 +0000 UTC m=+381.880775016" watchObservedRunningTime="2026-02-18 11:57:54.454573769 +0000 UTC m=+381.883474630"
Feb 18 11:57:55 crc kubenswrapper[4880]: I0218 11:57:55.445909 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdgb8" event={"ID":"679e3cd9-43cb-4010-93ce-d12efd5fc702","Type":"ContainerStarted","Data":"7be996fd0f24dfbfd345e5802dfcab1deb99462c2e4496e629e3d7b5112afdcc"}
Feb 18 11:57:55 crc kubenswrapper[4880]: I0218 11:57:55.459243 4880 generic.go:334] "Generic (PLEG): container finished" podID="a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64" containerID="6bd7f43f326b705ce7dcdab7818bdb65e90c1a455680411aa4630ccce974b0ae" exitCode=0
Feb 18 11:57:55 crc kubenswrapper[4880]: I0218 11:57:55.459342 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjvdn" event={"ID":"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64","Type":"ContainerDied","Data":"6bd7f43f326b705ce7dcdab7818bdb65e90c1a455680411aa4630ccce974b0ae"}
Feb 18 11:57:55 crc kubenswrapper[4880]: I0218 11:57:55.462013 4880 generic.go:334] "Generic (PLEG): container finished" podID="ba267750-47ef-4ac0-ad93-d82731ca5b8b" containerID="b95548c671e3973cb17a67c3c5865c51e4fe9609aeea1d357391bc8eb364ee45" exitCode=0
Feb 18 11:57:55 crc kubenswrapper[4880]: I0218 11:57:55.462921 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2szd" event={"ID":"ba267750-47ef-4ac0-ad93-d82731ca5b8b","Type":"ContainerDied","Data":"b95548c671e3973cb17a67c3c5865c51e4fe9609aeea1d357391bc8eb364ee45"}
Feb 18 11:57:55 crc kubenswrapper[4880]: I0218 11:57:55.469048 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hdgb8" podStartSLOduration=2.760065503 podStartE2EDuration="5.469028101s" podCreationTimestamp="2026-02-18 11:57:50 +0000 UTC" firstStartedPulling="2026-02-18 11:57:52.382354538 +0000 UTC m=+379.811255409" lastFinishedPulling="2026-02-18 11:57:55.091317146 +0000 UTC m=+382.520218007" observedRunningTime="2026-02-18 11:57:55.464937889 +0000 UTC m=+382.893838760" watchObservedRunningTime="2026-02-18 11:57:55.469028101 +0000 UTC m=+382.897928962"
Feb 18 11:57:57 crc kubenswrapper[4880]: I0218 11:57:57.474206 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2szd" event={"ID":"ba267750-47ef-4ac0-ad93-d82731ca5b8b","Type":"ContainerStarted","Data":"49cfe965e5344ca4939d3b16a8371fbb7e75b0da1f9e0be11e0b85c615041cb6"}
Feb 18 11:57:57 crc kubenswrapper[4880]: I0218 11:57:57.476002 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjvdn" event={"ID":"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64","Type":"ContainerStarted","Data":"ca02e69518c98bf718ded4f299999ac47e882cdc459beb4bca0633fb7e8c00a0"}
Feb 18 11:57:57 crc kubenswrapper[4880]: I0218 11:57:57.497389 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l2szd" podStartSLOduration=3.023327052 podStartE2EDuration="5.497304157s" podCreationTimestamp="2026-02-18 11:57:52 +0000 UTC" firstStartedPulling="2026-02-18 11:57:53.397026886 +0000 UTC m=+380.825927747" lastFinishedPulling="2026-02-18 11:57:55.871003991 +0000 UTC m=+383.299904852" observedRunningTime="2026-02-18 11:57:57.493626616 +0000 UTC m=+384.922527487" watchObservedRunningTime="2026-02-18 11:57:57.497304157 +0000 UTC m=+384.926205018"
Feb 18 11:57:58 crc kubenswrapper[4880]: I0218 11:57:58.483302 4880 generic.go:334] "Generic (PLEG): container finished" podID="a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64" containerID="ca02e69518c98bf718ded4f299999ac47e882cdc459beb4bca0633fb7e8c00a0" exitCode=0
Feb 18 11:57:58 crc kubenswrapper[4880]: I0218 11:57:58.483414 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjvdn" event={"ID":"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64","Type":"ContainerDied","Data":"ca02e69518c98bf718ded4f299999ac47e882cdc459beb4bca0633fb7e8c00a0"}
Feb 18 11:57:59 crc kubenswrapper[4880]: I0218 11:57:59.505283 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjvdn" event={"ID":"a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64","Type":"ContainerStarted","Data":"a0c2e87192bbc3a8d76581f02da0491893b4324c87c65c2b12609c5f2658cd2b"}
Feb 18 11:57:59 crc kubenswrapper[4880]: I0218 11:57:59.522481 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gjvdn" podStartSLOduration=3.082272503 podStartE2EDuration="6.522418387s" podCreationTimestamp="2026-02-18 11:57:53 +0000 UTC" firstStartedPulling="2026-02-18 11:57:55.463647043 +0000 UTC m=+382.892547904" lastFinishedPulling="2026-02-18 11:57:58.903792927 +0000 UTC m=+386.332693788" observedRunningTime="2026-02-18 11:57:59.520529954 +0000 UTC m=+386.949430815" watchObservedRunningTime="2026-02-18 11:57:59.522418387 +0000 UTC m=+386.951319248"
Feb 18 11:58:00 crc kubenswrapper[4880]: I0218 11:58:00.233637 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-92z9r"
Feb 18 11:58:00 crc kubenswrapper[4880]: I0218 11:58:00.233697 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-92z9r"
Feb 18 11:58:00 crc kubenswrapper[4880]: I0218 11:58:00.272492 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-92z9r"
Feb 18 11:58:00 crc kubenswrapper[4880]: I0218 11:58:00.554391 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-92z9r"
Feb 18 11:58:01 crc kubenswrapper[4880]: I0218 11:58:01.220710 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hdgb8"
Feb 18 11:58:01 crc kubenswrapper[4880]: I0218 11:58:01.220765 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hdgb8"
Feb 18 11:58:01 crc kubenswrapper[4880]: I0218 11:58:01.258894 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hdgb8"
Feb 18 11:58:01 crc kubenswrapper[4880]: I0218 11:58:01.550716 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hdgb8"
Feb 18 11:58:02 crc kubenswrapper[4880]: I0218 11:58:02.673792 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l2szd"
Feb 18 11:58:02 crc kubenswrapper[4880]: I0218 11:58:02.673855 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l2szd"
Feb 18 11:58:02 crc kubenswrapper[4880]: I0218 11:58:02.710288 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l2szd"
Feb 18 11:58:03 crc kubenswrapper[4880]: I0218 11:58:03.578243 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l2szd"
Feb 18 11:58:03 crc kubenswrapper[4880]: I0218 11:58:03.709246 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:58:03 crc kubenswrapper[4880]: I0218 11:58:03.710704 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:58:04 crc kubenswrapper[4880]: I0218 11:58:04.753690 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjvdn" podUID="a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64" containerName="registry-server" probeResult="failure" output=<
Feb 18 11:58:04 crc kubenswrapper[4880]: timeout: failed to connect service ":50051" within 1s
Feb 18 11:58:04 crc kubenswrapper[4880]: >
Feb 18 11:58:11 crc kubenswrapper[4880]: I0218 11:58:11.676096 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-fcstk"
Feb 18 11:58:11 crc kubenswrapper[4880]: I0218 11:58:11.734196 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bj57p"]
Feb 18 11:58:13 crc kubenswrapper[4880]: I0218 11:58:13.704557 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:58:13 crc kubenswrapper[4880]: I0218 11:58:13.742580 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gjvdn"
Feb 18 11:58:23 crc kubenswrapper[4880]: I0218 11:58:23.274190 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 11:58:23 crc kubenswrapper[4880]: I0218 11:58:23.275257 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:58:36 crc kubenswrapper[4880]: I0218 11:58:36.767639 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" podUID="a7f1515f-71f7-47dc-b429-1ce17401b9ea" containerName="registry" containerID="cri-o://ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7" gracePeriod=30
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.212215 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.326068 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-bound-sa-token\") pod \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") "
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.326197 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzrxh\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-kube-api-access-vzrxh\") pod \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") "
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.326305 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-certificates\") pod \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") "
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.326431 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a7f1515f-71f7-47dc-b429-1ce17401b9ea-ca-trust-extracted\") pod \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") "
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.326532 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-tls\") pod \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") "
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.326600 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-trusted-ca\") pod \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") "
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.326848 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a7f1515f-71f7-47dc-b429-1ce17401b9ea-installation-pull-secrets\") pod \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") "
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.327140 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\" (UID: \"a7f1515f-71f7-47dc-b429-1ce17401b9ea\") "
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.327622 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a7f1515f-71f7-47dc-b429-1ce17401b9ea" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.327683 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a7f1515f-71f7-47dc-b429-1ce17401b9ea" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.337309 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a7f1515f-71f7-47dc-b429-1ce17401b9ea" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.337694 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-kube-api-access-vzrxh" (OuterVolumeSpecName: "kube-api-access-vzrxh") pod "a7f1515f-71f7-47dc-b429-1ce17401b9ea" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea"). InnerVolumeSpecName "kube-api-access-vzrxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.338952 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a7f1515f-71f7-47dc-b429-1ce17401b9ea" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.340121 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f1515f-71f7-47dc-b429-1ce17401b9ea-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a7f1515f-71f7-47dc-b429-1ce17401b9ea" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.344826 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7f1515f-71f7-47dc-b429-1ce17401b9ea-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a7f1515f-71f7-47dc-b429-1ce17401b9ea" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.370921 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a7f1515f-71f7-47dc-b429-1ce17401b9ea" (UID: "a7f1515f-71f7-47dc-b429-1ce17401b9ea"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.429583 4880 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a7f1515f-71f7-47dc-b429-1ce17401b9ea-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.429657 4880 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.429668 4880 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.429678 4880 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a7f1515f-71f7-47dc-b429-1ce17401b9ea-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.429692 4880 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.429703 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzrxh\" (UniqueName: \"kubernetes.io/projected/a7f1515f-71f7-47dc-b429-1ce17401b9ea-kube-api-access-vzrxh\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.429715 4880 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7f1515f-71f7-47dc-b429-1ce17401b9ea-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.728933 4880 generic.go:334] "Generic (PLEG): container finished" podID="a7f1515f-71f7-47dc-b429-1ce17401b9ea" containerID="ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7" exitCode=0
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.729002 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" event={"ID":"a7f1515f-71f7-47dc-b429-1ce17401b9ea","Type":"ContainerDied","Data":"ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7"}
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.729058 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p"
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.729084 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bj57p" event={"ID":"a7f1515f-71f7-47dc-b429-1ce17401b9ea","Type":"ContainerDied","Data":"f822f50dd490787ff4a1b5b605fa1230e42285702a05a3b34552a225ac9ecc6e"}
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.729135 4880 scope.go:117] "RemoveContainer" containerID="ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7"
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.767798 4880 scope.go:117] "RemoveContainer" containerID="ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7"
Feb 18 11:58:37 crc kubenswrapper[4880]: E0218 11:58:37.768548 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7\": container with ID starting with ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7 not found: ID does not exist" containerID="ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7"
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.768677 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7"} err="failed to get container status \"ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7\": rpc error: code = NotFound desc = could not find container \"ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7\": container with ID starting with ac9a2419656e234c22f16ad4fac6a38c285517cf6ca5498f4fceffad04799bc7 not found: ID does not exist"
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.777554 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bj57p"]
Feb 18 11:58:37 crc kubenswrapper[4880]: I0218 11:58:37.782545 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bj57p"]
Feb 18 11:58:39 crc kubenswrapper[4880]: I0218 11:58:39.188452 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7f1515f-71f7-47dc-b429-1ce17401b9ea" path="/var/lib/kubelet/pods/a7f1515f-71f7-47dc-b429-1ce17401b9ea/volumes"
Feb 18 11:58:53 crc kubenswrapper[4880]: I0218 11:58:53.274568 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 11:58:53 crc kubenswrapper[4880]: I0218 11:58:53.277720 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 11:58:53 crc kubenswrapper[4880]: I0218 11:58:53.277959 4880 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp"
Feb 18 11:58:53 crc kubenswrapper[4880]: I0218 11:58:53.279531 4880 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60fb6b1e2c099ffdb6e6dd1164357f5a1d76a11dcc81d2d4ad9a7cfb6689e88e"} pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 11:58:53 crc kubenswrapper[4880]: I0218 11:58:53.279815 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" containerID="cri-o://60fb6b1e2c099ffdb6e6dd1164357f5a1d76a11dcc81d2d4ad9a7cfb6689e88e" gracePeriod=600
Feb 18 11:58:53 crc kubenswrapper[4880]: I0218 11:58:53.841054 4880 generic.go:334] "Generic (PLEG): container finished" podID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerID="60fb6b1e2c099ffdb6e6dd1164357f5a1d76a11dcc81d2d4ad9a7cfb6689e88e" exitCode=0
Feb 18 11:58:53 crc kubenswrapper[4880]: I0218 11:58:53.841152 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerDied","Data":"60fb6b1e2c099ffdb6e6dd1164357f5a1d76a11dcc81d2d4ad9a7cfb6689e88e"}
Feb 18 11:58:53 crc kubenswrapper[4880]: I0218 11:58:53.841721 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerStarted","Data":"ed8d7757e9e56a505647e83dce073d85ff43cb245ee9dfe28ed0b00f01fd6740"}
Feb 18 11:58:53 crc kubenswrapper[4880]: I0218 11:58:53.841764 4880 scope.go:117] "RemoveContainer" containerID="ad50a46aebf61e4f885d3d25630719c084ae33e58c50ecc38fe961f31b6efc82"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.174332 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"]
Feb 18 12:00:00 crc kubenswrapper[4880]: E0218 12:00:00.175173 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f1515f-71f7-47dc-b429-1ce17401b9ea" containerName="registry"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.175189 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f1515f-71f7-47dc-b429-1ce17401b9ea" containerName="registry"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.175296 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f1515f-71f7-47dc-b429-1ce17401b9ea" containerName="registry"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.175718 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.178085 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.178700 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.189254 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"]
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.314193 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc977c04-f231-4e06-8dae-afad308fc4db-config-volume\") pod \"collect-profiles-29523600-xs2mz\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.314775 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpsw6\" (UniqueName: \"kubernetes.io/projected/bc977c04-f231-4e06-8dae-afad308fc4db-kube-api-access-qpsw6\") pod \"collect-profiles-29523600-xs2mz\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.314957 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc977c04-f231-4e06-8dae-afad308fc4db-secret-volume\") pod \"collect-profiles-29523600-xs2mz\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.416303 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc977c04-f231-4e06-8dae-afad308fc4db-config-volume\") pod \"collect-profiles-29523600-xs2mz\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.416408 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpsw6\" (UniqueName: \"kubernetes.io/projected/bc977c04-f231-4e06-8dae-afad308fc4db-kube-api-access-qpsw6\") pod \"collect-profiles-29523600-xs2mz\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.416454 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc977c04-f231-4e06-8dae-afad308fc4db-secret-volume\") pod \"collect-profiles-29523600-xs2mz\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.417673 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc977c04-f231-4e06-8dae-afad308fc4db-config-volume\") pod \"collect-profiles-29523600-xs2mz\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.427184 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc977c04-f231-4e06-8dae-afad308fc4db-secret-volume\") pod \"collect-profiles-29523600-xs2mz\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.436401 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpsw6\" (UniqueName: \"kubernetes.io/projected/bc977c04-f231-4e06-8dae-afad308fc4db-kube-api-access-qpsw6\") pod \"collect-profiles-29523600-xs2mz\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.502472 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"
Feb 18 12:00:00 crc kubenswrapper[4880]: I0218 12:00:00.732361 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz"]
Feb 18 12:00:01 crc kubenswrapper[4880]: I0218 12:00:01.262895 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz" event={"ID":"bc977c04-f231-4e06-8dae-afad308fc4db","Type":"ContainerStarted","Data":"a9ac8d8d5ae794c661e1906313c9ec0e5d7d2011ce5824f0aca85b09d9383be0"}
Feb 18 12:00:01 crc kubenswrapper[4880]: I0218 12:00:01.262963 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz" event={"ID":"bc977c04-f231-4e06-8dae-afad308fc4db","Type":"ContainerStarted","Data":"3950d624b5dcce7893ed027f827f56d6a4ba8634300061d5500e5d3ac9a83456"}
Feb 18 12:00:01 crc kubenswrapper[4880]: I0218 12:00:01.292348 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz" podStartSLOduration=1.292325715 podStartE2EDuration="1.292325715s" podCreationTimestamp="2026-02-18 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:00:01.290140865 +0000 UTC m=+508.719041726" watchObservedRunningTime="2026-02-18 12:00:01.292325715 +0000 UTC m=+508.721226586"
Feb 18 12:00:02 crc kubenswrapper[4880]: I0218 12:00:02.270295 4880 generic.go:334] "Generic (PLEG): container finished" podID="bc977c04-f231-4e06-8dae-afad308fc4db" containerID="a9ac8d8d5ae794c661e1906313c9ec0e5d7d2011ce5824f0aca85b09d9383be0" exitCode=0
Feb 18 12:00:02 crc kubenswrapper[4880]: I0218 12:00:02.270391 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz" event={"ID":"bc977c04-f231-4e06-8dae-afad308fc4db","Type":"ContainerDied","Data":"a9ac8d8d5ae794c661e1906313c9ec0e5d7d2011ce5824f0aca85b09d9383be0"} Feb 18 12:00:03 crc kubenswrapper[4880]: I0218 12:00:03.504270 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz" Feb 18 12:00:03 crc kubenswrapper[4880]: I0218 12:00:03.663185 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpsw6\" (UniqueName: \"kubernetes.io/projected/bc977c04-f231-4e06-8dae-afad308fc4db-kube-api-access-qpsw6\") pod \"bc977c04-f231-4e06-8dae-afad308fc4db\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " Feb 18 12:00:03 crc kubenswrapper[4880]: I0218 12:00:03.663335 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc977c04-f231-4e06-8dae-afad308fc4db-config-volume\") pod \"bc977c04-f231-4e06-8dae-afad308fc4db\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " Feb 18 12:00:03 crc kubenswrapper[4880]: I0218 12:00:03.663497 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc977c04-f231-4e06-8dae-afad308fc4db-secret-volume\") pod \"bc977c04-f231-4e06-8dae-afad308fc4db\" (UID: \"bc977c04-f231-4e06-8dae-afad308fc4db\") " Feb 18 12:00:03 crc kubenswrapper[4880]: I0218 12:00:03.664524 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc977c04-f231-4e06-8dae-afad308fc4db-config-volume" (OuterVolumeSpecName: "config-volume") pod "bc977c04-f231-4e06-8dae-afad308fc4db" (UID: "bc977c04-f231-4e06-8dae-afad308fc4db"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:00:03 crc kubenswrapper[4880]: I0218 12:00:03.670602 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc977c04-f231-4e06-8dae-afad308fc4db-kube-api-access-qpsw6" (OuterVolumeSpecName: "kube-api-access-qpsw6") pod "bc977c04-f231-4e06-8dae-afad308fc4db" (UID: "bc977c04-f231-4e06-8dae-afad308fc4db"). InnerVolumeSpecName "kube-api-access-qpsw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:00:03 crc kubenswrapper[4880]: I0218 12:00:03.670837 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc977c04-f231-4e06-8dae-afad308fc4db-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bc977c04-f231-4e06-8dae-afad308fc4db" (UID: "bc977c04-f231-4e06-8dae-afad308fc4db"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:00:03 crc kubenswrapper[4880]: I0218 12:00:03.765275 4880 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc977c04-f231-4e06-8dae-afad308fc4db-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:03 crc kubenswrapper[4880]: I0218 12:00:03.765332 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpsw6\" (UniqueName: \"kubernetes.io/projected/bc977c04-f231-4e06-8dae-afad308fc4db-kube-api-access-qpsw6\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:03 crc kubenswrapper[4880]: I0218 12:00:03.765349 4880 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc977c04-f231-4e06-8dae-afad308fc4db-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 12:00:04 crc kubenswrapper[4880]: I0218 12:00:04.284601 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz" 
event={"ID":"bc977c04-f231-4e06-8dae-afad308fc4db","Type":"ContainerDied","Data":"3950d624b5dcce7893ed027f827f56d6a4ba8634300061d5500e5d3ac9a83456"} Feb 18 12:00:04 crc kubenswrapper[4880]: I0218 12:00:04.284673 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3950d624b5dcce7893ed027f827f56d6a4ba8634300061d5500e5d3ac9a83456" Feb 18 12:00:04 crc kubenswrapper[4880]: I0218 12:00:04.284998 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523600-xs2mz" Feb 18 12:00:53 crc kubenswrapper[4880]: I0218 12:00:53.274533 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:00:53 crc kubenswrapper[4880]: I0218 12:00:53.275152 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:01:23 crc kubenswrapper[4880]: I0218 12:01:23.274363 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:01:23 crc kubenswrapper[4880]: I0218 12:01:23.277343 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:01:53 crc kubenswrapper[4880]: I0218 12:01:53.274672 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:01:53 crc kubenswrapper[4880]: I0218 12:01:53.275202 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:01:53 crc kubenswrapper[4880]: I0218 12:01:53.275245 4880 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 12:01:53 crc kubenswrapper[4880]: I0218 12:01:53.275755 4880 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed8d7757e9e56a505647e83dce073d85ff43cb245ee9dfe28ed0b00f01fd6740"} pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 12:01:53 crc kubenswrapper[4880]: I0218 12:01:53.275799 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" containerID="cri-o://ed8d7757e9e56a505647e83dce073d85ff43cb245ee9dfe28ed0b00f01fd6740" gracePeriod=600 Feb 18 12:01:53 crc kubenswrapper[4880]: I0218 12:01:53.940737 4880 generic.go:334] "Generic (PLEG): container finished" 
podID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerID="ed8d7757e9e56a505647e83dce073d85ff43cb245ee9dfe28ed0b00f01fd6740" exitCode=0 Feb 18 12:01:53 crc kubenswrapper[4880]: I0218 12:01:53.940808 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerDied","Data":"ed8d7757e9e56a505647e83dce073d85ff43cb245ee9dfe28ed0b00f01fd6740"} Feb 18 12:01:53 crc kubenswrapper[4880]: I0218 12:01:53.941154 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerStarted","Data":"ffb3f920a68020d12bee188c9effd93292465486a03ad6482f8c49de81c5a836"} Feb 18 12:01:53 crc kubenswrapper[4880]: I0218 12:01:53.941172 4880 scope.go:117] "RemoveContainer" containerID="60fb6b1e2c099ffdb6e6dd1164357f5a1d76a11dcc81d2d4ad9a7cfb6689e88e" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.919978 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-p4bzm"] Feb 18 12:01:58 crc kubenswrapper[4880]: E0218 12:01:58.921264 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc977c04-f231-4e06-8dae-afad308fc4db" containerName="collect-profiles" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.921284 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc977c04-f231-4e06-8dae-afad308fc4db" containerName="collect-profiles" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.921421 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc977c04-f231-4e06-8dae-afad308fc4db" containerName="collect-profiles" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.922039 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-p4bzm" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.923782 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt"] Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.924743 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.925736 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.926743 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.935301 4880 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-cz9ph" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.939115 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-p4bzm"] Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.943697 4880 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bgnkn" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.949019 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt"] Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.952111 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5jz97"] Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.952833 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-5jz97" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.955057 4880 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-stznk" Feb 18 12:01:58 crc kubenswrapper[4880]: I0218 12:01:58.977147 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5jz97"] Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.021908 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79hmd\" (UniqueName: \"kubernetes.io/projected/9d5b3735-fda7-4348-bd36-ffde74bed4d5-kube-api-access-79hmd\") pod \"cert-manager-858654f9db-p4bzm\" (UID: \"9d5b3735-fda7-4348-bd36-ffde74bed4d5\") " pod="cert-manager/cert-manager-858654f9db-p4bzm" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.021990 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s2bg\" (UniqueName: \"kubernetes.io/projected/73158f7c-619d-4202-ae40-543a797efbf2-kube-api-access-7s2bg\") pod \"cert-manager-cainjector-cf98fcc89-lxkwt\" (UID: \"73158f7c-619d-4202-ae40-543a797efbf2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.123248 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fptdg\" (UniqueName: \"kubernetes.io/projected/3412227f-d74c-4cdd-a225-e2eaa1d64ad0-kube-api-access-fptdg\") pod \"cert-manager-webhook-687f57d79b-5jz97\" (UID: \"3412227f-d74c-4cdd-a225-e2eaa1d64ad0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5jz97" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.123307 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s2bg\" (UniqueName: 
\"kubernetes.io/projected/73158f7c-619d-4202-ae40-543a797efbf2-kube-api-access-7s2bg\") pod \"cert-manager-cainjector-cf98fcc89-lxkwt\" (UID: \"73158f7c-619d-4202-ae40-543a797efbf2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.123369 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79hmd\" (UniqueName: \"kubernetes.io/projected/9d5b3735-fda7-4348-bd36-ffde74bed4d5-kube-api-access-79hmd\") pod \"cert-manager-858654f9db-p4bzm\" (UID: \"9d5b3735-fda7-4348-bd36-ffde74bed4d5\") " pod="cert-manager/cert-manager-858654f9db-p4bzm" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.143312 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79hmd\" (UniqueName: \"kubernetes.io/projected/9d5b3735-fda7-4348-bd36-ffde74bed4d5-kube-api-access-79hmd\") pod \"cert-manager-858654f9db-p4bzm\" (UID: \"9d5b3735-fda7-4348-bd36-ffde74bed4d5\") " pod="cert-manager/cert-manager-858654f9db-p4bzm" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.143591 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s2bg\" (UniqueName: \"kubernetes.io/projected/73158f7c-619d-4202-ae40-543a797efbf2-kube-api-access-7s2bg\") pod \"cert-manager-cainjector-cf98fcc89-lxkwt\" (UID: \"73158f7c-619d-4202-ae40-543a797efbf2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.225206 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fptdg\" (UniqueName: \"kubernetes.io/projected/3412227f-d74c-4cdd-a225-e2eaa1d64ad0-kube-api-access-fptdg\") pod \"cert-manager-webhook-687f57d79b-5jz97\" (UID: \"3412227f-d74c-4cdd-a225-e2eaa1d64ad0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5jz97" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.240732 4880 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-p4bzm" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.244473 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fptdg\" (UniqueName: \"kubernetes.io/projected/3412227f-d74c-4cdd-a225-e2eaa1d64ad0-kube-api-access-fptdg\") pod \"cert-manager-webhook-687f57d79b-5jz97\" (UID: \"3412227f-d74c-4cdd-a225-e2eaa1d64ad0\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5jz97" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.250603 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.275898 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-5jz97" Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.557038 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5jz97"] Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.565168 4880 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.699094 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt"] Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.702434 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-p4bzm"] Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.987266 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-p4bzm" event={"ID":"9d5b3735-fda7-4348-bd36-ffde74bed4d5","Type":"ContainerStarted","Data":"6f3c46f0dca54a5308f314345e9e87a0c1333defd103e6bafe4c01d4d5907d16"} Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.989336 4880 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-5jz97" event={"ID":"3412227f-d74c-4cdd-a225-e2eaa1d64ad0","Type":"ContainerStarted","Data":"b6ed285a7e3fa591792e87c741e73eac807965a506f9e311a73907d939158eaa"} Feb 18 12:01:59 crc kubenswrapper[4880]: I0218 12:01:59.990405 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt" event={"ID":"73158f7c-619d-4202-ae40-543a797efbf2","Type":"ContainerStarted","Data":"e6665daf1327e3a438e92f3e30061efc2d0e9546e58871667cbc7e7f6684338f"} Feb 18 12:02:07 crc kubenswrapper[4880]: I0218 12:02:07.038695 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-5jz97" event={"ID":"3412227f-d74c-4cdd-a225-e2eaa1d64ad0","Type":"ContainerStarted","Data":"496a85e3c8eb09bf5ec0c642828a53074b2fb82fce4de78b772955be36204661"} Feb 18 12:02:07 crc kubenswrapper[4880]: I0218 12:02:07.039317 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-5jz97" Feb 18 12:02:07 crc kubenswrapper[4880]: I0218 12:02:07.041023 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-p4bzm" event={"ID":"9d5b3735-fda7-4348-bd36-ffde74bed4d5","Type":"ContainerStarted","Data":"127c1e238b1e8d4bd5859b31648830219609157804bd86c098861e0c37f4baba"} Feb 18 12:02:07 crc kubenswrapper[4880]: I0218 12:02:07.060165 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-5jz97" podStartSLOduration=2.324551743 podStartE2EDuration="9.060145439s" podCreationTimestamp="2026-02-18 12:01:58 +0000 UTC" firstStartedPulling="2026-02-18 12:01:59.564903411 +0000 UTC m=+626.993804282" lastFinishedPulling="2026-02-18 12:02:06.300497117 +0000 UTC m=+633.729397978" observedRunningTime="2026-02-18 12:02:07.057139836 +0000 UTC m=+634.486040707" watchObservedRunningTime="2026-02-18 
12:02:07.060145439 +0000 UTC m=+634.489046300" Feb 18 12:02:07 crc kubenswrapper[4880]: I0218 12:02:07.074930 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-p4bzm" podStartSLOduration=2.5934827289999998 podStartE2EDuration="9.074883368s" podCreationTimestamp="2026-02-18 12:01:58 +0000 UTC" firstStartedPulling="2026-02-18 12:01:59.707430064 +0000 UTC m=+627.136330935" lastFinishedPulling="2026-02-18 12:02:06.188830703 +0000 UTC m=+633.617731574" observedRunningTime="2026-02-18 12:02:07.07026742 +0000 UTC m=+634.499168281" watchObservedRunningTime="2026-02-18 12:02:07.074883368 +0000 UTC m=+634.503784229" Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.047727 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt" event={"ID":"73158f7c-619d-4202-ae40-543a797efbf2","Type":"ContainerStarted","Data":"c2185f48b9b229adfaf415a9ac1cc185f1d26dbc9e5324ca7002d7787497bf3c"} Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.065740 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lxkwt" podStartSLOduration=2.118825471 podStartE2EDuration="10.065722037s" podCreationTimestamp="2026-02-18 12:01:58 +0000 UTC" firstStartedPulling="2026-02-18 12:01:59.709289185 +0000 UTC m=+627.138190056" lastFinishedPulling="2026-02-18 12:02:07.656185761 +0000 UTC m=+635.085086622" observedRunningTime="2026-02-18 12:02:08.061762777 +0000 UTC m=+635.490663658" watchObservedRunningTime="2026-02-18 12:02:08.065722037 +0000 UTC m=+635.494622898" Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.836474 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6jxd5"] Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.837392 4880 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovn-controller" containerID="cri-o://bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3" gracePeriod=30 Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.837546 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="northd" containerID="cri-o://09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585" gracePeriod=30 Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.837592 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4" gracePeriod=30 Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.837732 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovn-acl-logging" containerID="cri-o://9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9" gracePeriod=30 Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.837759 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="sbdb" containerID="cri-o://9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b" gracePeriod=30 Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.837922 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="kube-rbac-proxy-node" 
containerID="cri-o://c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616" gracePeriod=30 Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.840333 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="nbdb" containerID="cri-o://c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad" gracePeriod=30 Feb 18 12:02:08 crc kubenswrapper[4880]: I0218 12:02:08.884340 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" containerID="cri-o://b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c" gracePeriod=30 Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.058013 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mh8wn_3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8/kube-multus/2.log" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.058548 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mh8wn_3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8/kube-multus/1.log" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.058597 4880 generic.go:334] "Generic (PLEG): container finished" podID="3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8" containerID="e257872bff6e653d23a87b41f3b00970844c0ededb3938761df49207c05f4ef6" exitCode=2 Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.058748 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mh8wn" event={"ID":"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8","Type":"ContainerDied","Data":"e257872bff6e653d23a87b41f3b00970844c0ededb3938761df49207c05f4ef6"} Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.058791 4880 scope.go:117] "RemoveContainer" containerID="6ddf0af2fadd4cd3cd5dea91a3c2576f790bde3a724701b684765ecf5618ea36" Feb 
18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.059179 4880 scope.go:117] "RemoveContainer" containerID="e257872bff6e653d23a87b41f3b00970844c0ededb3938761df49207c05f4ef6" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.059361 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mh8wn_openshift-multus(3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8)\"" pod="openshift-multus/multus-mh8wn" podUID="3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.063727 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovnkube-controller/3.log" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.065918 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovn-acl-logging/0.log" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067222 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovn-controller/0.log" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067679 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c" exitCode=0 Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067700 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4" exitCode=0 Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067708 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" 
containerID="c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616" exitCode=0 Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067714 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9" exitCode=143 Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067722 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3" exitCode=143 Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067795 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c"} Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067874 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4"} Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067892 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616"} Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067904 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9"} Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.067917 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3"} Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.161412 4880 scope.go:117] "RemoveContainer" containerID="fd2cfc6b63e096520b355c8f901e92d00aba35370d7315533b61cde160a90881" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.649971 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovn-acl-logging/0.log" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.650508 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovn-controller/0.log" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.650918 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.702955 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gxsfb"] Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.703404 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.703471 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.703526 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.703579 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" 
containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.703657 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="kubecfg-setup" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.703717 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="kubecfg-setup" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.703775 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="sbdb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.703822 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="sbdb" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.703874 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="kube-rbac-proxy-node" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.703924 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="kube-rbac-proxy-node" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.703979 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="northd" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.704024 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="northd" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.704072 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="nbdb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.704117 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="nbdb" Feb 18 12:02:09 crc 
kubenswrapper[4880]: E0218 12:02:09.704167 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.704218 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.704269 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovn-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.704316 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovn-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.704367 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.704416 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.704471 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovn-acl-logging" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.704521 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovn-acl-logging" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.704727 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="sbdb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.704828 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovn-controller" Feb 18 
12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.704888 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="northd" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.704946 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.705003 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.705064 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="nbdb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.705114 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovn-acl-logging" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.705166 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.705242 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.705296 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.705348 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="kube-rbac-proxy-node" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.705491 4880 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.705549 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: E0218 12:02:09.705654 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.705741 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.705881 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerName="ovnkube-controller" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.707490 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767458 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-node-log\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767514 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-slash\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767550 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-script-lib\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767551 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-node-log" (OuterVolumeSpecName: "node-log") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767579 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-log-socket\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767664 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-log-socket" (OuterVolumeSpecName: "log-socket") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767686 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-var-lib-cni-networks-ovn-kubernetes\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767699 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-slash" (OuterVolumeSpecName: "host-slash") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767745 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-ovn\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767774 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-kubelet\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767799 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767806 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-netns\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767821 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767840 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzfkz\" (UniqueName: \"kubernetes.io/projected/0ff5dd18-cbe8-4a79-9518-9786a3521131-kube-api-access-xzfkz\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767832 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767875 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-netd\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767899 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767914 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-openvswitch\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767935 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767942 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovn-node-metrics-cert\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767962 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.767982 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-var-lib-openvswitch\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768020 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-systemd-units\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768043 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-env-overrides\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768060 
4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768078 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-ovn-kubernetes\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768116 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-config\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768144 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-bin\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768180 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-systemd\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768205 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-etc-openvswitch\") pod \"0ff5dd18-cbe8-4a79-9518-9786a3521131\" (UID: \"0ff5dd18-cbe8-4a79-9518-9786a3521131\") " Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768079 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768395 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768418 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768437 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768498 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768567 4880 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768582 4880 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768593 4880 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768605 4880 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768637 4880 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768657 4880 reconciler_common.go:293] "Volume detached for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-node-log\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768668 4880 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-slash\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768680 4880 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-log-socket\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768691 4880 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768703 4880 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768715 4880 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768725 4880 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768737 4880 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-host-cni-netd\") on node \"crc\" 
DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768748 4880 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768761 4880 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.768815 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.769017 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.773628 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ff5dd18-cbe8-4a79-9518-9786a3521131-kube-api-access-xzfkz" (OuterVolumeSpecName: "kube-api-access-xzfkz") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "kube-api-access-xzfkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.773978 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.783514 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "0ff5dd18-cbe8-4a79-9518-9786a3521131" (UID: "0ff5dd18-cbe8-4a79-9518-9786a3521131"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.870979 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-run-netns\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871030 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-run-openvswitch\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871045 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-cni-bin\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871062 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-ovnkube-config\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871083 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-run-systemd\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871165 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-cni-netd\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871244 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-ovnkube-script-lib\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871305 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-kubelet\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871319 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-var-lib-openvswitch\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871342 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871361 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-etc-openvswitch\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871410 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-run-ovn\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871527 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-slash\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871551 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbzh2\" (UniqueName: \"kubernetes.io/projected/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-kube-api-access-tbzh2\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871580 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-env-overrides\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871636 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-systemd-units\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871668 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-ovn-node-metrics-cert\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871737 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-node-log\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871772 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-log-socket\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871815 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871890 4880 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871911 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzfkz\" (UniqueName: \"kubernetes.io/projected/0ff5dd18-cbe8-4a79-9518-9786a3521131-kube-api-access-xzfkz\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871925 4880 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 
crc kubenswrapper[4880]: I0218 12:02:09.871938 4880 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ff5dd18-cbe8-4a79-9518-9786a3521131-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.871947 4880 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ff5dd18-cbe8-4a79-9518-9786a3521131-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.972931 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-run-netns\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.972983 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-run-openvswitch\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973002 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-cni-bin\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973026 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-ovnkube-config\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973050 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-run-systemd\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973064 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-cni-netd\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973084 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-ovnkube-script-lib\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973116 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-run-netns\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973149 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-cni-bin\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 
12:02:09.973193 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-run-systemd\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973236 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-run-openvswitch\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973179 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-kubelet\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973140 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-kubelet\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973248 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-cni-netd\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973294 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-var-lib-openvswitch\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973328 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973387 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-etc-openvswitch\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973416 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-run-ovn\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973445 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973410 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-var-lib-openvswitch\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973490 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-etc-openvswitch\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973588 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-slash\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973648 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbzh2\" (UniqueName: \"kubernetes.io/projected/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-kube-api-access-tbzh2\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973696 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-env-overrides\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973724 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-systemd-units\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973760 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-ovn-node-metrics-cert\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973787 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-node-log\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973818 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-log-socket\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973841 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973889 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-ovnkube-script-lib\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973930 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-run-ovn\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973961 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-ovnkube-config\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973980 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-systemd-units\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.973965 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-node-log\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.974020 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-log-socket\") pod \"ovnkube-node-gxsfb\" (UID: 
\"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.974049 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.974075 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-host-slash\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.974473 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-env-overrides\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.977123 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-ovn-node-metrics-cert\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:09 crc kubenswrapper[4880]: I0218 12:02:09.989607 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbzh2\" (UniqueName: \"kubernetes.io/projected/05eb7c18-4ad0-46c0-91c9-093ebb4efd56-kube-api-access-tbzh2\") pod \"ovnkube-node-gxsfb\" (UID: \"05eb7c18-4ad0-46c0-91c9-093ebb4efd56\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.022121 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" Feb 18 12:02:10 crc kubenswrapper[4880]: W0218 12:02:10.045372 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05eb7c18_4ad0_46c0_91c9_093ebb4efd56.slice/crio-7788751cddc95b5e76b931753c9031cdcce4f8949fe40e6a32a2dec1afef5739 WatchSource:0}: Error finding container 7788751cddc95b5e76b931753c9031cdcce4f8949fe40e6a32a2dec1afef5739: Status 404 returned error can't find the container with id 7788751cddc95b5e76b931753c9031cdcce4f8949fe40e6a32a2dec1afef5739 Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.074466 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" event={"ID":"05eb7c18-4ad0-46c0-91c9-093ebb4efd56","Type":"ContainerStarted","Data":"7788751cddc95b5e76b931753c9031cdcce4f8949fe40e6a32a2dec1afef5739"} Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.076539 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mh8wn_3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8/kube-multus/2.log" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.082195 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovn-acl-logging/0.log" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.083059 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jxd5_0ff5dd18-cbe8-4a79-9518-9786a3521131/ovn-controller/0.log" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.083466 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" 
containerID="9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b" exitCode=0 Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.083490 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad" exitCode=0 Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.083498 4880 generic.go:334] "Generic (PLEG): container finished" podID="0ff5dd18-cbe8-4a79-9518-9786a3521131" containerID="09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585" exitCode=0 Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.083527 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b"} Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.083587 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad"} Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.083598 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585"} Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.083607 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" event={"ID":"0ff5dd18-cbe8-4a79-9518-9786a3521131","Type":"ContainerDied","Data":"e1534cddb14cd97a3849504461d496f35a9380e39fee3d7d2767f1799fad0326"} Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.083635 4880 scope.go:117] "RemoveContainer" 
containerID="b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.083661 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6jxd5" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.119216 4880 scope.go:117] "RemoveContainer" containerID="9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.124885 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6jxd5"] Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.131807 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6jxd5"] Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.135763 4880 scope.go:117] "RemoveContainer" containerID="c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.148744 4880 scope.go:117] "RemoveContainer" containerID="09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.163468 4880 scope.go:117] "RemoveContainer" containerID="7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.180949 4880 scope.go:117] "RemoveContainer" containerID="c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.195711 4880 scope.go:117] "RemoveContainer" containerID="9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.211828 4880 scope.go:117] "RemoveContainer" containerID="bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.280414 4880 scope.go:117] "RemoveContainer" 
containerID="8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.295535 4880 scope.go:117] "RemoveContainer" containerID="b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c" Feb 18 12:02:10 crc kubenswrapper[4880]: E0218 12:02:10.297683 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c\": container with ID starting with b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c not found: ID does not exist" containerID="b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.297729 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c"} err="failed to get container status \"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c\": rpc error: code = NotFound desc = could not find container \"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c\": container with ID starting with b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c not found: ID does not exist" Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.297757 4880 scope.go:117] "RemoveContainer" containerID="9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b" Feb 18 12:02:10 crc kubenswrapper[4880]: E0218 12:02:10.298133 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\": container with ID starting with 9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b not found: ID does not exist" containerID="9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b" Feb 18 12:02:10 crc 
kubenswrapper[4880]: I0218 12:02:10.298249 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b"} err="failed to get container status \"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\": rpc error: code = NotFound desc = could not find container \"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\": container with ID starting with 9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.298347 4880 scope.go:117] "RemoveContainer" containerID="c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad"
Feb 18 12:02:10 crc kubenswrapper[4880]: E0218 12:02:10.299486 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\": container with ID starting with c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad not found: ID does not exist" containerID="c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.299507 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad"} err="failed to get container status \"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\": rpc error: code = NotFound desc = could not find container \"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\": container with ID starting with c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.299521 4880 scope.go:117] "RemoveContainer" containerID="09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585"
Feb 18 12:02:10 crc kubenswrapper[4880]: E0218 12:02:10.299842 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\": container with ID starting with 09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585 not found: ID does not exist" containerID="09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.299866 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585"} err="failed to get container status \"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\": rpc error: code = NotFound desc = could not find container \"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\": container with ID starting with 09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.299882 4880 scope.go:117] "RemoveContainer" containerID="7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4"
Feb 18 12:02:10 crc kubenswrapper[4880]: E0218 12:02:10.300179 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\": container with ID starting with 7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4 not found: ID does not exist" containerID="7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.300261 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4"} err="failed to get container status \"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\": rpc error: code = NotFound desc = could not find container \"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\": container with ID starting with 7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.300325 4880 scope.go:117] "RemoveContainer" containerID="c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616"
Feb 18 12:02:10 crc kubenswrapper[4880]: E0218 12:02:10.300688 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\": container with ID starting with c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616 not found: ID does not exist" containerID="c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.300819 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616"} err="failed to get container status \"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\": rpc error: code = NotFound desc = could not find container \"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\": container with ID starting with c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.300902 4880 scope.go:117] "RemoveContainer" containerID="9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9"
Feb 18 12:02:10 crc kubenswrapper[4880]: E0218 12:02:10.301157 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\": container with ID starting with 9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9 not found: ID does not exist" containerID="9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.301192 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9"} err="failed to get container status \"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\": rpc error: code = NotFound desc = could not find container \"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\": container with ID starting with 9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.301232 4880 scope.go:117] "RemoveContainer" containerID="bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3"
Feb 18 12:02:10 crc kubenswrapper[4880]: E0218 12:02:10.301511 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\": container with ID starting with bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3 not found: ID does not exist" containerID="bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.301677 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3"} err="failed to get container status \"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\": rpc error: code = NotFound desc = could not find container \"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\": container with ID starting with bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.301810 4880 scope.go:117] "RemoveContainer" containerID="8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888"
Feb 18 12:02:10 crc kubenswrapper[4880]: E0218 12:02:10.302150 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\": container with ID starting with 8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888 not found: ID does not exist" containerID="8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.302249 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888"} err="failed to get container status \"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\": rpc error: code = NotFound desc = could not find container \"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\": container with ID starting with 8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.302322 4880 scope.go:117] "RemoveContainer" containerID="b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.302538 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c"} err="failed to get container status \"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c\": rpc error: code = NotFound desc = could not find container \"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c\": container with ID starting with b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.302651 4880 scope.go:117] "RemoveContainer" containerID="9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.302938 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b"} err="failed to get container status \"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\": rpc error: code = NotFound desc = could not find container \"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\": container with ID starting with 9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.303030 4880 scope.go:117] "RemoveContainer" containerID="c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.303272 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad"} err="failed to get container status \"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\": rpc error: code = NotFound desc = could not find container \"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\": container with ID starting with c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.303358 4880 scope.go:117] "RemoveContainer" containerID="09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.303654 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585"} err="failed to get container status \"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\": rpc error: code = NotFound desc = could not find container \"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\": container with ID starting with 09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.303746 4880 scope.go:117] "RemoveContainer" containerID="7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.304019 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4"} err="failed to get container status \"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\": rpc error: code = NotFound desc = could not find container \"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\": container with ID starting with 7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.304123 4880 scope.go:117] "RemoveContainer" containerID="c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.304370 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616"} err="failed to get container status \"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\": rpc error: code = NotFound desc = could not find container \"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\": container with ID starting with c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.304443 4880 scope.go:117] "RemoveContainer" containerID="9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.304722 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9"} err="failed to get container status \"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\": rpc error: code = NotFound desc = could not find container \"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\": container with ID starting with 9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.304818 4880 scope.go:117] "RemoveContainer" containerID="bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.305033 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3"} err="failed to get container status \"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\": rpc error: code = NotFound desc = could not find container \"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\": container with ID starting with bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.305108 4880 scope.go:117] "RemoveContainer" containerID="8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.305372 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888"} err="failed to get container status \"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\": rpc error: code = NotFound desc = could not find container \"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\": container with ID starting with 8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.305446 4880 scope.go:117] "RemoveContainer" containerID="b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.305687 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c"} err="failed to get container status \"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c\": rpc error: code = NotFound desc = could not find container \"b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c\": container with ID starting with b126edf5371d8ef7206d3656dbb7eb5f82cca42ff2435db2deabb6772a0b841c not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.305788 4880 scope.go:117] "RemoveContainer" containerID="9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.306017 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b"} err="failed to get container status \"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\": rpc error: code = NotFound desc = could not find container \"9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b\": container with ID starting with 9556e39f9131d2289dc4ba1f314ee38c9a9b6791d5bdf6a26dff77aec97d472b not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.306095 4880 scope.go:117] "RemoveContainer" containerID="c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.306371 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad"} err="failed to get container status \"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\": rpc error: code = NotFound desc = could not find container \"c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad\": container with ID starting with c791ba68d2305880a9b26ab590da89a5858947edf8aba1be996b50cc70703cad not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.306448 4880 scope.go:117] "RemoveContainer" containerID="09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.306702 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585"} err="failed to get container status \"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\": rpc error: code = NotFound desc = could not find container \"09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585\": container with ID starting with 09f136660dccba1032c56c527295874bebb17c822b779354d2eafeb1014b9585 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.306788 4880 scope.go:117] "RemoveContainer" containerID="7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.307010 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4"} err="failed to get container status \"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\": rpc error: code = NotFound desc = could not find container \"7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4\": container with ID starting with 7c5ef4200027d2be2a92c7bbe8368466c9ca0bbd1234a17c8eeeef12008ad8b4 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.307090 4880 scope.go:117] "RemoveContainer" containerID="c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.307325 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616"} err="failed to get container status \"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\": rpc error: code = NotFound desc = could not find container \"c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616\": container with ID starting with c171e145d87e33ac41daa20b43fd9dc05d754f6286d7d071f9cd3e1bff38a616 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.307410 4880 scope.go:117] "RemoveContainer" containerID="9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.307708 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9"} err="failed to get container status \"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\": rpc error: code = NotFound desc = could not find container \"9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9\": container with ID starting with 9d6b15f2cf3daa82613c33e4086eac7448bd1d2d98dcb92f42c1ba5fe1b3e6d9 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.307791 4880 scope.go:117] "RemoveContainer" containerID="bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.307983 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3"} err="failed to get container status \"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\": rpc error: code = NotFound desc = could not find container \"bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3\": container with ID starting with bbe817158d6b19b45128e8c78c99df99a7a25893edc54bb53fd5ef6d4e4213d3 not found: ID does not exist"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.308104 4880 scope.go:117] "RemoveContainer" containerID="8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888"
Feb 18 12:02:10 crc kubenswrapper[4880]: I0218 12:02:10.308341 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888"} err="failed to get container status \"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\": rpc error: code = NotFound desc = could not find container \"8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888\": container with ID starting with 8f99c4f46556cc7e1b7a170cc9a1a740d5abcf4ed0d117d3f311a6fbd668a888 not found: ID does not exist"
Feb 18 12:02:11 crc kubenswrapper[4880]: I0218 12:02:11.090307 4880 generic.go:334] "Generic (PLEG): container finished" podID="05eb7c18-4ad0-46c0-91c9-093ebb4efd56" containerID="aab18940b64422680a20d0f093fcfa2a58b182094b3e37f885013e9ea1402dbd" exitCode=0
Feb 18 12:02:11 crc kubenswrapper[4880]: I0218 12:02:11.090351 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" event={"ID":"05eb7c18-4ad0-46c0-91c9-093ebb4efd56","Type":"ContainerDied","Data":"aab18940b64422680a20d0f093fcfa2a58b182094b3e37f885013e9ea1402dbd"}
Feb 18 12:02:11 crc kubenswrapper[4880]: I0218 12:02:11.187788 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ff5dd18-cbe8-4a79-9518-9786a3521131" path="/var/lib/kubelet/pods/0ff5dd18-cbe8-4a79-9518-9786a3521131/volumes"
Feb 18 12:02:12 crc kubenswrapper[4880]: I0218 12:02:12.098653 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" event={"ID":"05eb7c18-4ad0-46c0-91c9-093ebb4efd56","Type":"ContainerStarted","Data":"890c96b285ad704956eb9b222e5ff5a93fda170912e821ab3a8e451ab1ef6094"}
Feb 18 12:02:12 crc kubenswrapper[4880]: I0218 12:02:12.098935 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" event={"ID":"05eb7c18-4ad0-46c0-91c9-093ebb4efd56","Type":"ContainerStarted","Data":"c9b2a3a51620541519723d3170445a761186ec256e12d133249c4bc425bbb288"}
Feb 18 12:02:12 crc kubenswrapper[4880]: I0218 12:02:12.098946 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" event={"ID":"05eb7c18-4ad0-46c0-91c9-093ebb4efd56","Type":"ContainerStarted","Data":"118cb17b73cb730c7de5c9ab948e0088c30086952ddadb82a1d387f42e2df48b"}
Feb 18 12:02:12 crc kubenswrapper[4880]: I0218 12:02:12.098954 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" event={"ID":"05eb7c18-4ad0-46c0-91c9-093ebb4efd56","Type":"ContainerStarted","Data":"aa2e5c1e88ccb0b12ccfdf2baa17e6065c6d95b5f68d321c97c30d9ef6a455fe"}
Feb 18 12:02:13 crc kubenswrapper[4880]: I0218 12:02:13.109956 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" event={"ID":"05eb7c18-4ad0-46c0-91c9-093ebb4efd56","Type":"ContainerStarted","Data":"a1ca2849d935e183c56be0a2db27614e19451dc4d7980425d82c89814f00bbb7"}
Feb 18 12:02:13 crc kubenswrapper[4880]: I0218 12:02:13.110013 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" event={"ID":"05eb7c18-4ad0-46c0-91c9-093ebb4efd56","Type":"ContainerStarted","Data":"e4c3ae0b3e93e94aee3782171ffa4fbb38e4576ad3353dd28bd072e6ead753ab"}
Feb 18 12:02:14 crc kubenswrapper[4880]: I0218 12:02:14.279842 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-5jz97"
Feb 18 12:02:15 crc kubenswrapper[4880]: I0218 12:02:15.125206 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" event={"ID":"05eb7c18-4ad0-46c0-91c9-093ebb4efd56","Type":"ContainerStarted","Data":"a54f5a4efb94dd964f2f911e96fbde3cbf00ab343ba343ca1af6bd8af886a7ce"}
Feb 18 12:02:17 crc kubenswrapper[4880]: I0218 12:02:17.142447 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" event={"ID":"05eb7c18-4ad0-46c0-91c9-093ebb4efd56","Type":"ContainerStarted","Data":"286e0f9fdb68590f4ba89a07c8d9e9b9e6c9b3854713a5178af0e95ecfb42d71"}
Feb 18 12:02:17 crc kubenswrapper[4880]: I0218 12:02:17.145309 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb"
Feb 18 12:02:17 crc kubenswrapper[4880]: I0218 12:02:17.145385 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb"
Feb 18 12:02:17 crc kubenswrapper[4880]: I0218 12:02:17.145418 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb"
Feb 18 12:02:17 crc kubenswrapper[4880]: I0218 12:02:17.175681 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb"
Feb 18 12:02:17 crc kubenswrapper[4880]: I0218 12:02:17.184394 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb" podStartSLOduration=8.184378502 podStartE2EDuration="8.184378502s" podCreationTimestamp="2026-02-18 12:02:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:02:17.180456173 +0000 UTC m=+644.609357034" watchObservedRunningTime="2026-02-18 12:02:17.184378502 +0000 UTC m=+644.613279353"
Feb 18 12:02:17 crc kubenswrapper[4880]: I0218 12:02:17.191827 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb"
Feb 18 12:02:22 crc kubenswrapper[4880]: I0218 12:02:22.179327 4880 scope.go:117] "RemoveContainer" containerID="e257872bff6e653d23a87b41f3b00970844c0ededb3938761df49207c05f4ef6"
Feb 18 12:02:22 crc kubenswrapper[4880]: E0218 12:02:22.180331 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mh8wn_openshift-multus(3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8)\"" pod="openshift-multus/multus-mh8wn" podUID="3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8"
Feb 18 12:02:37 crc kubenswrapper[4880]: I0218 12:02:37.179601 4880 scope.go:117] "RemoveContainer" containerID="e257872bff6e653d23a87b41f3b00970844c0ededb3938761df49207c05f4ef6"
Feb 18 12:02:38 crc kubenswrapper[4880]: I0218 12:02:38.266008 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mh8wn_3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8/kube-multus/2.log"
Feb 18 12:02:38 crc kubenswrapper[4880]: I0218 12:02:38.266365 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mh8wn" event={"ID":"3f5ec2a3-906d-4ce2-88b1-ca3e3afd18f8","Type":"ContainerStarted","Data":"271c9d1ce4f777de31989e9f8ee66d959a2b6c557323480d1ad81306808ad8b3"}
Feb 18 12:02:40 crc kubenswrapper[4880]: I0218 12:02:40.041578 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxsfb"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.631178 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"]
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.632589 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.635507 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.644749 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"]
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.753388 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.753456 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.753523 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj2md\" (UniqueName: \"kubernetes.io/projected/6937b296-4e20-44b0-ab02-f16a49ac827e-kube-api-access-vj2md\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.854983 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj2md\" (UniqueName: \"kubernetes.io/projected/6937b296-4e20-44b0-ab02-f16a49ac827e-kube-api-access-vj2md\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.855053 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.855087 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.855964 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.856308 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.877489 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj2md\" (UniqueName: \"kubernetes.io/projected/6937b296-4e20-44b0-ab02-f16a49ac827e-kube-api-access-vj2md\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:45 crc kubenswrapper[4880]: I0218 12:02:45.956902 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:46 crc kubenswrapper[4880]: I0218 12:02:46.157157 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"]
Feb 18 12:02:46 crc kubenswrapper[4880]: I0218 12:02:46.311684 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl" event={"ID":"6937b296-4e20-44b0-ab02-f16a49ac827e","Type":"ContainerStarted","Data":"0f5517ca3f0e2c78caecb4c5eb9717e93300c414bb4cfbc439bf069772ce69bd"}
Feb 18 12:02:47 crc kubenswrapper[4880]: I0218 12:02:47.318135 4880 generic.go:334] "Generic (PLEG): container finished" podID="6937b296-4e20-44b0-ab02-f16a49ac827e" containerID="3d8f6db83a1d281939fc47bf6b4d813df3c0eeb8c15a2257aefa3891d592a382" exitCode=0
Feb 18 12:02:47 crc kubenswrapper[4880]: I0218 12:02:47.318199 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl" event={"ID":"6937b296-4e20-44b0-ab02-f16a49ac827e","Type":"ContainerDied","Data":"3d8f6db83a1d281939fc47bf6b4d813df3c0eeb8c15a2257aefa3891d592a382"}
Feb 18 12:02:50 crc kubenswrapper[4880]: I0218 12:02:50.334768 4880 generic.go:334] "Generic (PLEG): container finished" podID="6937b296-4e20-44b0-ab02-f16a49ac827e" containerID="bd171c096bc5a271a4f73e061d20ccdbc098294718fb8f000acb3bd76be17634" exitCode=0
Feb 18 12:02:50 crc kubenswrapper[4880]: I0218 12:02:50.334848 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl" event={"ID":"6937b296-4e20-44b0-ab02-f16a49ac827e","Type":"ContainerDied","Data":"bd171c096bc5a271a4f73e061d20ccdbc098294718fb8f000acb3bd76be17634"}
Feb 18 12:02:51 crc kubenswrapper[4880]: I0218 12:02:51.341960 4880 generic.go:334] "Generic (PLEG): container finished" podID="6937b296-4e20-44b0-ab02-f16a49ac827e" containerID="53bfaa56ee535dbc3c4ae11a6c8cfaf6eff7c9d5b208dac45cf9439e5fdcb979" exitCode=0
Feb 18 12:02:51 crc kubenswrapper[4880]: I0218 12:02:51.342009 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl" event={"ID":"6937b296-4e20-44b0-ab02-f16a49ac827e","Type":"ContainerDied","Data":"53bfaa56ee535dbc3c4ae11a6c8cfaf6eff7c9d5b208dac45cf9439e5fdcb979"}
Feb 18 12:02:52 crc kubenswrapper[4880]: I0218 12:02:52.613408 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl"
Feb 18 12:02:52 crc kubenswrapper[4880]: I0218 12:02:52.747281 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj2md\" (UniqueName: \"kubernetes.io/projected/6937b296-4e20-44b0-ab02-f16a49ac827e-kube-api-access-vj2md\") pod \"6937b296-4e20-44b0-ab02-f16a49ac827e\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") "
Feb 18 12:02:52 crc kubenswrapper[4880]: I0218 12:02:52.747396 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-util\") pod \"6937b296-4e20-44b0-ab02-f16a49ac827e\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") "
Feb 18 12:02:52 crc kubenswrapper[4880]: I0218 12:02:52.747509 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-bundle\") pod \"6937b296-4e20-44b0-ab02-f16a49ac827e\" (UID: \"6937b296-4e20-44b0-ab02-f16a49ac827e\") "
Feb 18 12:02:52 crc kubenswrapper[4880]: I0218 12:02:52.749345 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-bundle" (OuterVolumeSpecName: "bundle") pod "6937b296-4e20-44b0-ab02-f16a49ac827e" (UID: "6937b296-4e20-44b0-ab02-f16a49ac827e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:02:52 crc kubenswrapper[4880]: I0218 12:02:52.753160 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6937b296-4e20-44b0-ab02-f16a49ac827e-kube-api-access-vj2md" (OuterVolumeSpecName: "kube-api-access-vj2md") pod "6937b296-4e20-44b0-ab02-f16a49ac827e" (UID: "6937b296-4e20-44b0-ab02-f16a49ac827e"). InnerVolumeSpecName "kube-api-access-vj2md". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:02:52 crc kubenswrapper[4880]: I0218 12:02:52.758533 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-util" (OuterVolumeSpecName: "util") pod "6937b296-4e20-44b0-ab02-f16a49ac827e" (UID: "6937b296-4e20-44b0-ab02-f16a49ac827e"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:02:52 crc kubenswrapper[4880]: I0218 12:02:52.848935 4880 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:52 crc kubenswrapper[4880]: I0218 12:02:52.848980 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj2md\" (UniqueName: \"kubernetes.io/projected/6937b296-4e20-44b0-ab02-f16a49ac827e-kube-api-access-vj2md\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:52 crc kubenswrapper[4880]: I0218 12:02:52.848994 4880 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6937b296-4e20-44b0-ab02-f16a49ac827e-util\") on node \"crc\" DevicePath \"\"" Feb 18 12:02:53 crc kubenswrapper[4880]: I0218 12:02:53.353190 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl" event={"ID":"6937b296-4e20-44b0-ab02-f16a49ac827e","Type":"ContainerDied","Data":"0f5517ca3f0e2c78caecb4c5eb9717e93300c414bb4cfbc439bf069772ce69bd"} Feb 18 12:02:53 crc kubenswrapper[4880]: I0218 12:02:53.353484 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f5517ca3f0e2c78caecb4c5eb9717e93300c414bb4cfbc439bf069772ce69bd" Feb 18 12:02:53 crc kubenswrapper[4880]: I0218 12:02:53.353244 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.565178 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4"] Feb 18 12:03:02 crc kubenswrapper[4880]: E0218 12:03:02.566116 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6937b296-4e20-44b0-ab02-f16a49ac827e" containerName="extract" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.566130 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="6937b296-4e20-44b0-ab02-f16a49ac827e" containerName="extract" Feb 18 12:03:02 crc kubenswrapper[4880]: E0218 12:03:02.566156 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6937b296-4e20-44b0-ab02-f16a49ac827e" containerName="util" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.566162 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="6937b296-4e20-44b0-ab02-f16a49ac827e" containerName="util" Feb 18 12:03:02 crc kubenswrapper[4880]: E0218 12:03:02.566170 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6937b296-4e20-44b0-ab02-f16a49ac827e" containerName="pull" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.566177 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="6937b296-4e20-44b0-ab02-f16a49ac827e" containerName="pull" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.566264 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="6937b296-4e20-44b0-ab02-f16a49ac827e" containerName="extract" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.566697 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.570232 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.570561 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.577586 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4"] Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.579357 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-nhpgx" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.669344 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47n8n\" (UniqueName: \"kubernetes.io/projected/44a09229-68d6-48b2-99c3-cd3d3d9d2d9e-kube-api-access-47n8n\") pod \"obo-prometheus-operator-68bc856cb9-csbj4\" (UID: \"44a09229-68d6-48b2-99c3-cd3d3d9d2d9e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.683644 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8"] Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.684343 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.687111 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.688529 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-g9qr5" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.709011 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf"] Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.709987 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.714927 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8"] Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.737188 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf"] Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.770385 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/043f42bf-3e01-42a2-87b4-999205377d66-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-m55w8\" (UID: \"043f42bf-3e01-42a2-87b4-999205377d66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.770474 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-47n8n\" (UniqueName: \"kubernetes.io/projected/44a09229-68d6-48b2-99c3-cd3d3d9d2d9e-kube-api-access-47n8n\") pod \"obo-prometheus-operator-68bc856cb9-csbj4\" (UID: \"44a09229-68d6-48b2-99c3-cd3d3d9d2d9e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.770511 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2924423e-1b79-4c15-b9b2-cb0d1619ad5c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf\" (UID: \"2924423e-1b79-4c15-b9b2-cb0d1619ad5c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.770546 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2924423e-1b79-4c15-b9b2-cb0d1619ad5c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf\" (UID: \"2924423e-1b79-4c15-b9b2-cb0d1619ad5c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.770591 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/043f42bf-3e01-42a2-87b4-999205377d66-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-m55w8\" (UID: \"043f42bf-3e01-42a2-87b4-999205377d66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.789267 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47n8n\" (UniqueName: \"kubernetes.io/projected/44a09229-68d6-48b2-99c3-cd3d3d9d2d9e-kube-api-access-47n8n\") pod 
\"obo-prometheus-operator-68bc856cb9-csbj4\" (UID: \"44a09229-68d6-48b2-99c3-cd3d3d9d2d9e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.871105 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/043f42bf-3e01-42a2-87b4-999205377d66-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-m55w8\" (UID: \"043f42bf-3e01-42a2-87b4-999205377d66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.871430 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2924423e-1b79-4c15-b9b2-cb0d1619ad5c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf\" (UID: \"2924423e-1b79-4c15-b9b2-cb0d1619ad5c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.871546 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2924423e-1b79-4c15-b9b2-cb0d1619ad5c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf\" (UID: \"2924423e-1b79-4c15-b9b2-cb0d1619ad5c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.871649 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/043f42bf-3e01-42a2-87b4-999205377d66-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-m55w8\" (UID: \"043f42bf-3e01-42a2-87b4-999205377d66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" Feb 18 12:03:02 
crc kubenswrapper[4880]: I0218 12:03:02.875023 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2924423e-1b79-4c15-b9b2-cb0d1619ad5c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf\" (UID: \"2924423e-1b79-4c15-b9b2-cb0d1619ad5c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.875458 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/043f42bf-3e01-42a2-87b4-999205377d66-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-m55w8\" (UID: \"043f42bf-3e01-42a2-87b4-999205377d66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.875487 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/043f42bf-3e01-42a2-87b4-999205377d66-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-m55w8\" (UID: \"043f42bf-3e01-42a2-87b4-999205377d66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.876922 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2924423e-1b79-4c15-b9b2-cb0d1619ad5c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf\" (UID: \"2924423e-1b79-4c15-b9b2-cb0d1619ad5c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.885636 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hqkfn"] Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 
12:03:02.886345 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.888669 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-w7vzw" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.888905 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.889235 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.905868 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hqkfn"] Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.972094 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/478db717-fcd1-4a34-a2ca-c98b0bda52f7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hqkfn\" (UID: \"478db717-fcd1-4a34-a2ca-c98b0bda52f7\") " pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" Feb 18 12:03:02 crc kubenswrapper[4880]: I0218 12:03:02.972206 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwlmc\" (UniqueName: \"kubernetes.io/projected/478db717-fcd1-4a34-a2ca-c98b0bda52f7-kube-api-access-bwlmc\") pod \"observability-operator-59bdc8b94-hqkfn\" (UID: \"478db717-fcd1-4a34-a2ca-c98b0bda52f7\") " pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.001592 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.020771 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-458b7"] Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.022033 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-458b7" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.023718 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-tk7zk" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.032689 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.042918 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-458b7"] Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.074575 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwlmc\" (UniqueName: \"kubernetes.io/projected/478db717-fcd1-4a34-a2ca-c98b0bda52f7-kube-api-access-bwlmc\") pod \"observability-operator-59bdc8b94-hqkfn\" (UID: \"478db717-fcd1-4a34-a2ca-c98b0bda52f7\") " pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.074726 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/478db717-fcd1-4a34-a2ca-c98b0bda52f7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hqkfn\" (UID: \"478db717-fcd1-4a34-a2ca-c98b0bda52f7\") " pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 
12:03:03.079550 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/478db717-fcd1-4a34-a2ca-c98b0bda52f7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hqkfn\" (UID: \"478db717-fcd1-4a34-a2ca-c98b0bda52f7\") " pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.094727 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwlmc\" (UniqueName: \"kubernetes.io/projected/478db717-fcd1-4a34-a2ca-c98b0bda52f7-kube-api-access-bwlmc\") pod \"observability-operator-59bdc8b94-hqkfn\" (UID: \"478db717-fcd1-4a34-a2ca-c98b0bda52f7\") " pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.160994 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4"] Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.176238 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba8852c5-a3ee-43ad-b559-14018ecccf33-openshift-service-ca\") pod \"perses-operator-5bf474d74f-458b7\" (UID: \"ba8852c5-a3ee-43ad-b559-14018ecccf33\") " pod="openshift-operators/perses-operator-5bf474d74f-458b7" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.176282 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77rvk\" (UniqueName: \"kubernetes.io/projected/ba8852c5-a3ee-43ad-b559-14018ecccf33-kube-api-access-77rvk\") pod \"perses-operator-5bf474d74f-458b7\" (UID: \"ba8852c5-a3ee-43ad-b559-14018ecccf33\") " pod="openshift-operators/perses-operator-5bf474d74f-458b7" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.262547 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.277469 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba8852c5-a3ee-43ad-b559-14018ecccf33-openshift-service-ca\") pod \"perses-operator-5bf474d74f-458b7\" (UID: \"ba8852c5-a3ee-43ad-b559-14018ecccf33\") " pod="openshift-operators/perses-operator-5bf474d74f-458b7" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.277527 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77rvk\" (UniqueName: \"kubernetes.io/projected/ba8852c5-a3ee-43ad-b559-14018ecccf33-kube-api-access-77rvk\") pod \"perses-operator-5bf474d74f-458b7\" (UID: \"ba8852c5-a3ee-43ad-b559-14018ecccf33\") " pod="openshift-operators/perses-operator-5bf474d74f-458b7" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.279221 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba8852c5-a3ee-43ad-b559-14018ecccf33-openshift-service-ca\") pod \"perses-operator-5bf474d74f-458b7\" (UID: \"ba8852c5-a3ee-43ad-b559-14018ecccf33\") " pod="openshift-operators/perses-operator-5bf474d74f-458b7" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.306012 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77rvk\" (UniqueName: \"kubernetes.io/projected/ba8852c5-a3ee-43ad-b559-14018ecccf33-kube-api-access-77rvk\") pod \"perses-operator-5bf474d74f-458b7\" (UID: \"ba8852c5-a3ee-43ad-b559-14018ecccf33\") " pod="openshift-operators/perses-operator-5bf474d74f-458b7" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.331377 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf"] Feb 18 12:03:03 crc kubenswrapper[4880]: 
W0218 12:03:03.340124 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2924423e_1b79_4c15_b9b2_cb0d1619ad5c.slice/crio-a20ca9db11812c230b278ab3ca7e5f72ff674300d3e810ffe1afe208846b08dd WatchSource:0}: Error finding container a20ca9db11812c230b278ab3ca7e5f72ff674300d3e810ffe1afe208846b08dd: Status 404 returned error can't find the container with id a20ca9db11812c230b278ab3ca7e5f72ff674300d3e810ffe1afe208846b08dd Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.355996 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-458b7" Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.403433 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4" event={"ID":"44a09229-68d6-48b2-99c3-cd3d3d9d2d9e","Type":"ContainerStarted","Data":"e8a80d78481f846a9cc1e8be8208c93e4f774893d3f19d7b6409bbefc50648fc"} Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.412510 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" event={"ID":"2924423e-1b79-4c15-b9b2-cb0d1619ad5c","Type":"ContainerStarted","Data":"a20ca9db11812c230b278ab3ca7e5f72ff674300d3e810ffe1afe208846b08dd"} Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.587233 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8"] Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.626266 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hqkfn"] Feb 18 12:03:03 crc kubenswrapper[4880]: I0218 12:03:03.683351 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-458b7"] Feb 18 12:03:03 crc kubenswrapper[4880]: 
W0218 12:03:03.685811 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba8852c5_a3ee_43ad_b559_14018ecccf33.slice/crio-eafe3d3e4461fcbd5d2e7eb1eae66154509ce9555a7843660cc07694924731c5 WatchSource:0}: Error finding container eafe3d3e4461fcbd5d2e7eb1eae66154509ce9555a7843660cc07694924731c5: Status 404 returned error can't find the container with id eafe3d3e4461fcbd5d2e7eb1eae66154509ce9555a7843660cc07694924731c5 Feb 18 12:03:04 crc kubenswrapper[4880]: I0218 12:03:04.469499 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" event={"ID":"478db717-fcd1-4a34-a2ca-c98b0bda52f7","Type":"ContainerStarted","Data":"719168df3655915ea1e67d5a64215f14bbee368a6ece02463a017276b824fa21"} Feb 18 12:03:04 crc kubenswrapper[4880]: I0218 12:03:04.473011 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" event={"ID":"043f42bf-3e01-42a2-87b4-999205377d66","Type":"ContainerStarted","Data":"3bca806b511b6e89204ecaf9be8ae8bbb14acde81c727d9aad85b6763513ef83"} Feb 18 12:03:04 crc kubenswrapper[4880]: I0218 12:03:04.475579 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-458b7" event={"ID":"ba8852c5-a3ee-43ad-b559-14018ecccf33","Type":"ContainerStarted","Data":"eafe3d3e4461fcbd5d2e7eb1eae66154509ce9555a7843660cc07694924731c5"} Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.548022 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-458b7" event={"ID":"ba8852c5-a3ee-43ad-b559-14018ecccf33","Type":"ContainerStarted","Data":"fa96d31e61b8affdce9f2b4fdc43a458b46d17542f82be8da5646aa0f48c2586"} Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.548553 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/perses-operator-5bf474d74f-458b7" Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.549596 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" event={"ID":"478db717-fcd1-4a34-a2ca-c98b0bda52f7","Type":"ContainerStarted","Data":"1f7db5dc1f5561b06d79444a80758d12ef2de3023b4bd595672b1a6b100764f0"} Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.550455 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.551759 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" event={"ID":"043f42bf-3e01-42a2-87b4-999205377d66","Type":"ContainerStarted","Data":"6944412b30621b4ceae459f08edeb33b67add241d3085684b410357d4c330d25"} Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.552201 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.553916 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4" event={"ID":"44a09229-68d6-48b2-99c3-cd3d3d9d2d9e","Type":"ContainerStarted","Data":"bb8d264ef0365a2525149f60b2dc2b3a85fe1ea8d7fd8f1fabee9ede37cf83f6"} Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.555504 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" event={"ID":"2924423e-1b79-4c15-b9b2-cb0d1619ad5c","Type":"ContainerStarted","Data":"537e8fbc9633c2f778fbf2f341ff3249ac52b02165a90aa17b2fd22f5c67a43f"} Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.568924 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/perses-operator-5bf474d74f-458b7" podStartSLOduration=2.784685214 podStartE2EDuration="13.568907057s" podCreationTimestamp="2026-02-18 12:03:02 +0000 UTC" firstStartedPulling="2026-02-18 12:03:03.688818544 +0000 UTC m=+691.117719395" lastFinishedPulling="2026-02-18 12:03:14.473040367 +0000 UTC m=+701.901941238" observedRunningTime="2026-02-18 12:03:15.565211214 +0000 UTC m=+702.994112095" watchObservedRunningTime="2026-02-18 12:03:15.568907057 +0000 UTC m=+702.997807918"
Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.581584 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf" podStartSLOduration=2.479975801 podStartE2EDuration="13.581561719s" podCreationTimestamp="2026-02-18 12:03:02 +0000 UTC" firstStartedPulling="2026-02-18 12:03:03.344183192 +0000 UTC m=+690.773084053" lastFinishedPulling="2026-02-18 12:03:14.44576911 +0000 UTC m=+701.874669971" observedRunningTime="2026-02-18 12:03:15.578666899 +0000 UTC m=+703.007567770" watchObservedRunningTime="2026-02-18 12:03:15.581561719 +0000 UTC m=+703.010462590"
Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.601560 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-csbj4" podStartSLOduration=2.355385138 podStartE2EDuration="13.601541954s" podCreationTimestamp="2026-02-18 12:03:02 +0000 UTC" firstStartedPulling="2026-02-18 12:03:03.211821542 +0000 UTC m=+690.640722403" lastFinishedPulling="2026-02-18 12:03:14.457978358 +0000 UTC m=+701.886879219" observedRunningTime="2026-02-18 12:03:15.597973315 +0000 UTC m=+703.026874176" watchObservedRunningTime="2026-02-18 12:03:15.601541954 +0000 UTC m=+703.030442815"
Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.641537 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-hqkfn" podStartSLOduration=2.783685847 podStartE2EDuration="13.641517886s" podCreationTimestamp="2026-02-18 12:03:02 +0000 UTC" firstStartedPulling="2026-02-18 12:03:03.645037387 +0000 UTC m=+691.073938248" lastFinishedPulling="2026-02-18 12:03:14.502869426 +0000 UTC m=+701.931770287" observedRunningTime="2026-02-18 12:03:15.624879864 +0000 UTC m=+703.053780735" watchObservedRunningTime="2026-02-18 12:03:15.641517886 +0000 UTC m=+703.070418757"
Feb 18 12:03:15 crc kubenswrapper[4880]: I0218 12:03:15.654983 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-59dc64659b-m55w8" podStartSLOduration=2.841891155 podStartE2EDuration="13.65496133s" podCreationTimestamp="2026-02-18 12:03:02 +0000 UTC" firstStartedPulling="2026-02-18 12:03:03.632084037 +0000 UTC m=+691.060984898" lastFinishedPulling="2026-02-18 12:03:14.445154212 +0000 UTC m=+701.874055073" observedRunningTime="2026-02-18 12:03:15.64741822 +0000 UTC m=+703.076319081" watchObservedRunningTime="2026-02-18 12:03:15.65496133 +0000 UTC m=+703.083862191"
Feb 18 12:03:23 crc kubenswrapper[4880]: I0218 12:03:23.359154 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-458b7"
Feb 18 12:03:38 crc kubenswrapper[4880]: I0218 12:03:38.937111 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"]
Feb 18 12:03:38 crc kubenswrapper[4880]: I0218 12:03:38.939053 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:38 crc kubenswrapper[4880]: I0218 12:03:38.942528 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 18 12:03:38 crc kubenswrapper[4880]: I0218 12:03:38.947262 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"]
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.007378 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.007439 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.007483 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8rmd\" (UniqueName: \"kubernetes.io/projected/b9452f86-7c6a-45e7-bbe7-7980f815af42-kube-api-access-k8rmd\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.108348 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.108405 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.108446 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8rmd\" (UniqueName: \"kubernetes.io/projected/b9452f86-7c6a-45e7-bbe7-7980f815af42-kube-api-access-k8rmd\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.109386 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.109491 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.131300 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8rmd\" (UniqueName: \"kubernetes.io/projected/b9452f86-7c6a-45e7-bbe7-7980f815af42-kube-api-access-k8rmd\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.257745 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.525612 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"]
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.694160 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5" event={"ID":"b9452f86-7c6a-45e7-bbe7-7980f815af42","Type":"ContainerStarted","Data":"b62fce9bcc2dc8fb67607327d2977a29efbfb42a2d4aed0ca0f74ca6079bab37"}
Feb 18 12:03:39 crc kubenswrapper[4880]: I0218 12:03:39.694490 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5" event={"ID":"b9452f86-7c6a-45e7-bbe7-7980f815af42","Type":"ContainerStarted","Data":"9fa8d21b0004d0168683c895453fa116db6973662484f7f3cf0dd7cb186c7c24"}
Feb 18 12:03:40 crc kubenswrapper[4880]: I0218 12:03:40.699273 4880 generic.go:334] "Generic (PLEG): container finished" podID="b9452f86-7c6a-45e7-bbe7-7980f815af42" containerID="b62fce9bcc2dc8fb67607327d2977a29efbfb42a2d4aed0ca0f74ca6079bab37" exitCode=0
Feb 18 12:03:40 crc kubenswrapper[4880]: I0218 12:03:40.699325 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5" event={"ID":"b9452f86-7c6a-45e7-bbe7-7980f815af42","Type":"ContainerDied","Data":"b62fce9bcc2dc8fb67607327d2977a29efbfb42a2d4aed0ca0f74ca6079bab37"}
Feb 18 12:03:42 crc kubenswrapper[4880]: I0218 12:03:42.714975 4880 generic.go:334] "Generic (PLEG): container finished" podID="b9452f86-7c6a-45e7-bbe7-7980f815af42" containerID="c0b426751744c3fb52d5ac26934ebd83004255d97875dcd643ee8cd8bb41112d" exitCode=0
Feb 18 12:03:42 crc kubenswrapper[4880]: I0218 12:03:42.715064 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5" event={"ID":"b9452f86-7c6a-45e7-bbe7-7980f815af42","Type":"ContainerDied","Data":"c0b426751744c3fb52d5ac26934ebd83004255d97875dcd643ee8cd8bb41112d"}
Feb 18 12:03:43 crc kubenswrapper[4880]: I0218 12:03:43.722655 4880 generic.go:334] "Generic (PLEG): container finished" podID="b9452f86-7c6a-45e7-bbe7-7980f815af42" containerID="8b9a96b65f6b81b459f1e082095d8350ec202250274e22cbe0c5b31a2b2d893d" exitCode=0
Feb 18 12:03:43 crc kubenswrapper[4880]: I0218 12:03:43.722695 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5" event={"ID":"b9452f86-7c6a-45e7-bbe7-7980f815af42","Type":"ContainerDied","Data":"8b9a96b65f6b81b459f1e082095d8350ec202250274e22cbe0c5b31a2b2d893d"}
Feb 18 12:03:44 crc kubenswrapper[4880]: I0218 12:03:44.940589 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.092549 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8rmd\" (UniqueName: \"kubernetes.io/projected/b9452f86-7c6a-45e7-bbe7-7980f815af42-kube-api-access-k8rmd\") pod \"b9452f86-7c6a-45e7-bbe7-7980f815af42\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") "
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.092707 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-util\") pod \"b9452f86-7c6a-45e7-bbe7-7980f815af42\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") "
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.092732 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-bundle\") pod \"b9452f86-7c6a-45e7-bbe7-7980f815af42\" (UID: \"b9452f86-7c6a-45e7-bbe7-7980f815af42\") "
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.093420 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-bundle" (OuterVolumeSpecName: "bundle") pod "b9452f86-7c6a-45e7-bbe7-7980f815af42" (UID: "b9452f86-7c6a-45e7-bbe7-7980f815af42"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.099933 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9452f86-7c6a-45e7-bbe7-7980f815af42-kube-api-access-k8rmd" (OuterVolumeSpecName: "kube-api-access-k8rmd") pod "b9452f86-7c6a-45e7-bbe7-7980f815af42" (UID: "b9452f86-7c6a-45e7-bbe7-7980f815af42"). InnerVolumeSpecName "kube-api-access-k8rmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.194682 4880 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.194725 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8rmd\" (UniqueName: \"kubernetes.io/projected/b9452f86-7c6a-45e7-bbe7-7980f815af42-kube-api-access-k8rmd\") on node \"crc\" DevicePath \"\""
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.295533 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-util" (OuterVolumeSpecName: "util") pod "b9452f86-7c6a-45e7-bbe7-7980f815af42" (UID: "b9452f86-7c6a-45e7-bbe7-7980f815af42"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.396689 4880 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9452f86-7c6a-45e7-bbe7-7980f815af42-util\") on node \"crc\" DevicePath \"\""
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.737194 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5" event={"ID":"b9452f86-7c6a-45e7-bbe7-7980f815af42","Type":"ContainerDied","Data":"9fa8d21b0004d0168683c895453fa116db6973662484f7f3cf0dd7cb186c7c24"}
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.737567 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fa8d21b0004d0168683c895453fa116db6973662484f7f3cf0dd7cb186c7c24"
Feb 18 12:03:45 crc kubenswrapper[4880]: I0218 12:03:45.737294 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.573972 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-cv2ht"]
Feb 18 12:03:50 crc kubenswrapper[4880]: E0218 12:03:50.574580 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9452f86-7c6a-45e7-bbe7-7980f815af42" containerName="util"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.574592 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9452f86-7c6a-45e7-bbe7-7980f815af42" containerName="util"
Feb 18 12:03:50 crc kubenswrapper[4880]: E0218 12:03:50.574602 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9452f86-7c6a-45e7-bbe7-7980f815af42" containerName="extract"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.574622 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9452f86-7c6a-45e7-bbe7-7980f815af42" containerName="extract"
Feb 18 12:03:50 crc kubenswrapper[4880]: E0218 12:03:50.574639 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9452f86-7c6a-45e7-bbe7-7980f815af42" containerName="pull"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.574645 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9452f86-7c6a-45e7-bbe7-7980f815af42" containerName="pull"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.574743 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9452f86-7c6a-45e7-bbe7-7980f815af42" containerName="extract"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.575150 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-cv2ht"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.577398 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.577398 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-25g52"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.577529 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.585792 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-cv2ht"]
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.669028 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9vg2\" (UniqueName: \"kubernetes.io/projected/233947b8-94c0-4814-b63d-cfbce95bc5e3-kube-api-access-b9vg2\") pod \"nmstate-operator-694c9596b7-cv2ht\" (UID: \"233947b8-94c0-4814-b63d-cfbce95bc5e3\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-cv2ht"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.770909 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9vg2\" (UniqueName: \"kubernetes.io/projected/233947b8-94c0-4814-b63d-cfbce95bc5e3-kube-api-access-b9vg2\") pod \"nmstate-operator-694c9596b7-cv2ht\" (UID: \"233947b8-94c0-4814-b63d-cfbce95bc5e3\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-cv2ht"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.794603 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9vg2\" (UniqueName: \"kubernetes.io/projected/233947b8-94c0-4814-b63d-cfbce95bc5e3-kube-api-access-b9vg2\") pod \"nmstate-operator-694c9596b7-cv2ht\" (UID: \"233947b8-94c0-4814-b63d-cfbce95bc5e3\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-cv2ht"
Feb 18 12:03:50 crc kubenswrapper[4880]: I0218 12:03:50.891189 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-cv2ht"
Feb 18 12:03:51 crc kubenswrapper[4880]: I0218 12:03:51.140583 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-cv2ht"]
Feb 18 12:03:51 crc kubenswrapper[4880]: I0218 12:03:51.770149 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-cv2ht" event={"ID":"233947b8-94c0-4814-b63d-cfbce95bc5e3","Type":"ContainerStarted","Data":"2d68fe5c70aea220a046677ce617671e50f3ceff6b3437e0c0c064cec94e0e82"}
Feb 18 12:03:53 crc kubenswrapper[4880]: I0218 12:03:53.274268 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:03:53 crc kubenswrapper[4880]: I0218 12:03:53.274558 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:03:54 crc kubenswrapper[4880]: I0218 12:03:54.793874 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-cv2ht" event={"ID":"233947b8-94c0-4814-b63d-cfbce95bc5e3","Type":"ContainerStarted","Data":"de3224cade55872b09b68e5653c52c9e44023c678946a08d26a1fad4db4e198c"}
Feb 18 12:03:54 crc kubenswrapper[4880]: I0218 12:03:54.811367 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-cv2ht" podStartSLOduration=2.304177777 podStartE2EDuration="4.811348085s" podCreationTimestamp="2026-02-18 12:03:50 +0000 UTC" firstStartedPulling="2026-02-18 12:03:51.152133465 +0000 UTC m=+738.581034326" lastFinishedPulling="2026-02-18 12:03:53.659303773 +0000 UTC m=+741.088204634" observedRunningTime="2026-02-18 12:03:54.809273228 +0000 UTC m=+742.238174089" watchObservedRunningTime="2026-02-18 12:03:54.811348085 +0000 UTC m=+742.240248946"
Feb 18 12:03:59 crc kubenswrapper[4880]: I0218 12:03:59.959532 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs"]
Feb 18 12:03:59 crc kubenswrapper[4880]: I0218 12:03:59.960992 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs"
Feb 18 12:03:59 crc kubenswrapper[4880]: I0218 12:03:59.962432 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-njkx7"
Feb 18 12:03:59 crc kubenswrapper[4880]: I0218 12:03:59.977343 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs"]
Feb 18 12:03:59 crc kubenswrapper[4880]: I0218 12:03:59.983125 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-t6282"]
Feb 18 12:03:59 crc kubenswrapper[4880]: I0218 12:03:59.984048 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282"
Feb 18 12:03:59 crc kubenswrapper[4880]: I0218 12:03:59.987016 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Feb 18 12:03:59 crc kubenswrapper[4880]: I0218 12:03:59.994834 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp5lv\" (UniqueName: \"kubernetes.io/projected/abed6693-ac3d-4890-a45d-c9781587622e-kube-api-access-rp5lv\") pod \"nmstate-metrics-58c85c668d-mwnbs\" (UID: \"abed6693-ac3d-4890-a45d-c9781587622e\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs"
Feb 18 12:03:59 crc kubenswrapper[4880]: I0218 12:03:59.994930 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-t6282"]
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.026186 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jkq5d"]
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.027072 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.095988 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"]
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.096359 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0dedf240-e215-4de3-8794-b1223146ba9e-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-t6282\" (UID: \"0dedf240-e215-4de3-8794-b1223146ba9e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.096413 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55b70fd6-133c-41d1-b461-9507862b44fd-nmstate-lock\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.096437 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55b70fd6-133c-41d1-b461-9507862b44fd-dbus-socket\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.096471 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp5lv\" (UniqueName: \"kubernetes.io/projected/abed6693-ac3d-4890-a45d-c9781587622e-kube-api-access-rp5lv\") pod \"nmstate-metrics-58c85c668d-mwnbs\" (UID: \"abed6693-ac3d-4890-a45d-c9781587622e\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.096511 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55b70fd6-133c-41d1-b461-9507862b44fd-ovs-socket\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.096728 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq6fs\" (UniqueName: \"kubernetes.io/projected/55b70fd6-133c-41d1-b461-9507862b44fd-kube-api-access-rq6fs\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.096778 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.096822 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5fww\" (UniqueName: \"kubernetes.io/projected/0dedf240-e215-4de3-8794-b1223146ba9e-kube-api-access-v5fww\") pod \"nmstate-webhook-866bcb46dc-t6282\" (UID: \"0dedf240-e215-4de3-8794-b1223146ba9e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.099792 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-5bjfk"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.099823 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.100003 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.119593 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"]
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.128949 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp5lv\" (UniqueName: \"kubernetes.io/projected/abed6693-ac3d-4890-a45d-c9781587622e-kube-api-access-rp5lv\") pod \"nmstate-metrics-58c85c668d-mwnbs\" (UID: \"abed6693-ac3d-4890-a45d-c9781587622e\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198164 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0dedf240-e215-4de3-8794-b1223146ba9e-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-t6282\" (UID: \"0dedf240-e215-4de3-8794-b1223146ba9e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198220 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6303451a-3d25-4508-9af5-a67136a4fe25-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-sth8f\" (UID: \"6303451a-3d25-4508-9af5-a67136a4fe25\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198246 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55b70fd6-133c-41d1-b461-9507862b44fd-nmstate-lock\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198268 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55b70fd6-133c-41d1-b461-9507862b44fd-dbus-socket\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198306 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pgtf\" (UniqueName: \"kubernetes.io/projected/6303451a-3d25-4508-9af5-a67136a4fe25-kube-api-access-2pgtf\") pod \"nmstate-console-plugin-5c78fc5d65-sth8f\" (UID: \"6303451a-3d25-4508-9af5-a67136a4fe25\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"
Feb 18 12:04:00 crc kubenswrapper[4880]: E0218 12:04:00.198344 4880 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198360 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55b70fd6-133c-41d1-b461-9507862b44fd-nmstate-lock\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: E0218 12:04:00.198487 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dedf240-e215-4de3-8794-b1223146ba9e-tls-key-pair podName:0dedf240-e215-4de3-8794-b1223146ba9e nodeName:}" failed. No retries permitted until 2026-02-18 12:04:00.698447349 +0000 UTC m=+748.127348210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/0dedf240-e215-4de3-8794-b1223146ba9e-tls-key-pair") pod "nmstate-webhook-866bcb46dc-t6282" (UID: "0dedf240-e215-4de3-8794-b1223146ba9e") : secret "openshift-nmstate-webhook" not found
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198534 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6303451a-3d25-4508-9af5-a67136a4fe25-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-sth8f\" (UID: \"6303451a-3d25-4508-9af5-a67136a4fe25\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198568 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55b70fd6-133c-41d1-b461-9507862b44fd-dbus-socket\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198585 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55b70fd6-133c-41d1-b461-9507862b44fd-ovs-socket\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198636 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55b70fd6-133c-41d1-b461-9507862b44fd-ovs-socket\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198648 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq6fs\" (UniqueName: \"kubernetes.io/projected/55b70fd6-133c-41d1-b461-9507862b44fd-kube-api-access-rq6fs\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.198679 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5fww\" (UniqueName: \"kubernetes.io/projected/0dedf240-e215-4de3-8794-b1223146ba9e-kube-api-access-v5fww\") pod \"nmstate-webhook-866bcb46dc-t6282\" (UID: \"0dedf240-e215-4de3-8794-b1223146ba9e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.215213 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq6fs\" (UniqueName: \"kubernetes.io/projected/55b70fd6-133c-41d1-b461-9507862b44fd-kube-api-access-rq6fs\") pod \"nmstate-handler-jkq5d\" (UID: \"55b70fd6-133c-41d1-b461-9507862b44fd\") " pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.220989 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5fww\" (UniqueName: \"kubernetes.io/projected/0dedf240-e215-4de3-8794-b1223146ba9e-kube-api-access-v5fww\") pod \"nmstate-webhook-866bcb46dc-t6282\" (UID: \"0dedf240-e215-4de3-8794-b1223146ba9e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.276144 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.301248 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bcd9774b9-xkc25"]
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.301995 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bcd9774b9-xkc25"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.303077 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pgtf\" (UniqueName: \"kubernetes.io/projected/6303451a-3d25-4508-9af5-a67136a4fe25-kube-api-access-2pgtf\") pod \"nmstate-console-plugin-5c78fc5d65-sth8f\" (UID: \"6303451a-3d25-4508-9af5-a67136a4fe25\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.303117 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6303451a-3d25-4508-9af5-a67136a4fe25-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-sth8f\" (UID: \"6303451a-3d25-4508-9af5-a67136a4fe25\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.303181 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6303451a-3d25-4508-9af5-a67136a4fe25-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-sth8f\" (UID: \"6303451a-3d25-4508-9af5-a67136a4fe25\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.304000 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6303451a-3d25-4508-9af5-a67136a4fe25-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-sth8f\" (UID: \"6303451a-3d25-4508-9af5-a67136a4fe25\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.308963 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6303451a-3d25-4508-9af5-a67136a4fe25-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-sth8f\" (UID: \"6303451a-3d25-4508-9af5-a67136a4fe25\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.323374 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pgtf\" (UniqueName: \"kubernetes.io/projected/6303451a-3d25-4508-9af5-a67136a4fe25-kube-api-access-2pgtf\") pod \"nmstate-console-plugin-5c78fc5d65-sth8f\" (UID: \"6303451a-3d25-4508-9af5-a67136a4fe25\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.346766 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jkq5d"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.351785 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bcd9774b9-xkc25"]
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.406440 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-trusted-ca-bundle\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.406501 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-console-config\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.406536 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsw7p\" (UniqueName: \"kubernetes.io/projected/f2085047-5bac-473e-8fa5-1bac184bc7e5-kube-api-access-fsw7p\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.406563 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-oauth-serving-cert\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.406603 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-service-ca\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.406693 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2085047-5bac-473e-8fa5-1bac184bc7e5-console-oauth-config\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.406720 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2085047-5bac-473e-8fa5-1bac184bc7e5-console-serving-cert\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25"
Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.413326 4880 util.go:30]
"No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.488832 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs"] Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.508599 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2085047-5bac-473e-8fa5-1bac184bc7e5-console-oauth-config\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.508712 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2085047-5bac-473e-8fa5-1bac184bc7e5-console-serving-cert\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.508766 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-trusted-ca-bundle\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.508787 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-console-config\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.508810 4880 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fsw7p\" (UniqueName: \"kubernetes.io/projected/f2085047-5bac-473e-8fa5-1bac184bc7e5-kube-api-access-fsw7p\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.508868 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-oauth-serving-cert\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.508901 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-service-ca\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.509883 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-service-ca\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.511453 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-console-config\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.512055 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-oauth-serving-cert\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.512471 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2085047-5bac-473e-8fa5-1bac184bc7e5-trusted-ca-bundle\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.515289 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2085047-5bac-473e-8fa5-1bac184bc7e5-console-serving-cert\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.516489 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2085047-5bac-473e-8fa5-1bac184bc7e5-console-oauth-config\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.533106 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsw7p\" (UniqueName: \"kubernetes.io/projected/f2085047-5bac-473e-8fa5-1bac184bc7e5-kube-api-access-fsw7p\") pod \"console-7bcd9774b9-xkc25\" (UID: \"f2085047-5bac-473e-8fa5-1bac184bc7e5\") " pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.616142 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f"] Feb 18 12:04:00 crc 
kubenswrapper[4880]: W0218 12:04:00.617911 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6303451a_3d25_4508_9af5_a67136a4fe25.slice/crio-2b25d6affd1e9fd91c8ab3c6d989e9e3668f549f5100cc283e198641e242d8d5 WatchSource:0}: Error finding container 2b25d6affd1e9fd91c8ab3c6d989e9e3668f549f5100cc283e198641e242d8d5: Status 404 returned error can't find the container with id 2b25d6affd1e9fd91c8ab3c6d989e9e3668f549f5100cc283e198641e242d8d5 Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.661574 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.712209 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0dedf240-e215-4de3-8794-b1223146ba9e-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-t6282\" (UID: \"0dedf240-e215-4de3-8794-b1223146ba9e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.719441 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0dedf240-e215-4de3-8794-b1223146ba9e-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-t6282\" (UID: \"0dedf240-e215-4de3-8794-b1223146ba9e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282" Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.830714 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs" event={"ID":"abed6693-ac3d-4890-a45d-c9781587622e","Type":"ContainerStarted","Data":"c41c39d5616b5c05e3037ca726692fc06c295f1803aad966371845a4d3dcb244"} Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.833393 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f" event={"ID":"6303451a-3d25-4508-9af5-a67136a4fe25","Type":"ContainerStarted","Data":"2b25d6affd1e9fd91c8ab3c6d989e9e3668f549f5100cc283e198641e242d8d5"} Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.834689 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jkq5d" event={"ID":"55b70fd6-133c-41d1-b461-9507862b44fd","Type":"ContainerStarted","Data":"d391d6147fbee8078f07d52289b233618d3a8eb12fccac29260002478c597751"} Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.842755 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bcd9774b9-xkc25"] Feb 18 12:04:00 crc kubenswrapper[4880]: W0218 12:04:00.845294 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2085047_5bac_473e_8fa5_1bac184bc7e5.slice/crio-f3c3b0ce892743c05d6cd6a98a02811e0ff80de80926c358f0cc86d1be2fb549 WatchSource:0}: Error finding container f3c3b0ce892743c05d6cd6a98a02811e0ff80de80926c358f0cc86d1be2fb549: Status 404 returned error can't find the container with id f3c3b0ce892743c05d6cd6a98a02811e0ff80de80926c358f0cc86d1be2fb549 Feb 18 12:04:00 crc kubenswrapper[4880]: I0218 12:04:00.929451 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282" Feb 18 12:04:01 crc kubenswrapper[4880]: I0218 12:04:01.187941 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-t6282"] Feb 18 12:04:01 crc kubenswrapper[4880]: W0218 12:04:01.191477 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dedf240_e215_4de3_8794_b1223146ba9e.slice/crio-194b6ca048ebdaed91615903fa62709c1ba077e9ffc83c2b7af05e77a61be6cb WatchSource:0}: Error finding container 194b6ca048ebdaed91615903fa62709c1ba077e9ffc83c2b7af05e77a61be6cb: Status 404 returned error can't find the container with id 194b6ca048ebdaed91615903fa62709c1ba077e9ffc83c2b7af05e77a61be6cb Feb 18 12:04:01 crc kubenswrapper[4880]: I0218 12:04:01.843879 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcd9774b9-xkc25" event={"ID":"f2085047-5bac-473e-8fa5-1bac184bc7e5","Type":"ContainerStarted","Data":"274ab8375432584e955dfcbf0c40583078429558d954713ed28e0ec224de9c56"} Feb 18 12:04:01 crc kubenswrapper[4880]: I0218 12:04:01.844253 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcd9774b9-xkc25" event={"ID":"f2085047-5bac-473e-8fa5-1bac184bc7e5","Type":"ContainerStarted","Data":"f3c3b0ce892743c05d6cd6a98a02811e0ff80de80926c358f0cc86d1be2fb549"} Feb 18 12:04:01 crc kubenswrapper[4880]: I0218 12:04:01.845418 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282" event={"ID":"0dedf240-e215-4de3-8794-b1223146ba9e","Type":"ContainerStarted","Data":"194b6ca048ebdaed91615903fa62709c1ba077e9ffc83c2b7af05e77a61be6cb"} Feb 18 12:04:01 crc kubenswrapper[4880]: I0218 12:04:01.867036 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bcd9774b9-xkc25" podStartSLOduration=1.8670108810000001 
podStartE2EDuration="1.867010881s" podCreationTimestamp="2026-02-18 12:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:04:01.863141784 +0000 UTC m=+749.292042665" watchObservedRunningTime="2026-02-18 12:04:01.867010881 +0000 UTC m=+749.295911742" Feb 18 12:04:04 crc kubenswrapper[4880]: I0218 12:04:04.865370 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jkq5d" event={"ID":"55b70fd6-133c-41d1-b461-9507862b44fd","Type":"ContainerStarted","Data":"546272d1a2f6064a1c97d14ea3e065938e1189561d532dbf2775069661db1f21"} Feb 18 12:04:04 crc kubenswrapper[4880]: I0218 12:04:04.865966 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jkq5d" Feb 18 12:04:04 crc kubenswrapper[4880]: I0218 12:04:04.867213 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs" event={"ID":"abed6693-ac3d-4890-a45d-c9781587622e","Type":"ContainerStarted","Data":"060e2f905e5483e94d5d7f19be74360311b9c67fa12c0173da76cc621a55a6e6"} Feb 18 12:04:04 crc kubenswrapper[4880]: I0218 12:04:04.869248 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f" event={"ID":"6303451a-3d25-4508-9af5-a67136a4fe25","Type":"ContainerStarted","Data":"5b8e4e999717ea80353af3e3e28aa79bc93929fada9e319a7ca679ef6df1df44"} Feb 18 12:04:04 crc kubenswrapper[4880]: I0218 12:04:04.871399 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282" event={"ID":"0dedf240-e215-4de3-8794-b1223146ba9e","Type":"ContainerStarted","Data":"6c2498fe24b186c0d3020608211402617c3de6aa3a042c69a05e9c9b4bbcc0e2"} Feb 18 12:04:04 crc kubenswrapper[4880]: I0218 12:04:04.871544 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282" Feb 18 12:04:04 crc kubenswrapper[4880]: I0218 12:04:04.879874 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jkq5d" podStartSLOduration=2.009427708 podStartE2EDuration="5.87985482s" podCreationTimestamp="2026-02-18 12:03:59 +0000 UTC" firstStartedPulling="2026-02-18 12:04:00.384053089 +0000 UTC m=+747.812953950" lastFinishedPulling="2026-02-18 12:04:04.254480201 +0000 UTC m=+751.683381062" observedRunningTime="2026-02-18 12:04:04.878541673 +0000 UTC m=+752.307442534" watchObservedRunningTime="2026-02-18 12:04:04.87985482 +0000 UTC m=+752.308755681" Feb 18 12:04:04 crc kubenswrapper[4880]: I0218 12:04:04.913988 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sth8f" podStartSLOduration=1.2782888620000001 podStartE2EDuration="4.913962407s" podCreationTimestamp="2026-02-18 12:04:00 +0000 UTC" firstStartedPulling="2026-02-18 12:04:00.621327177 +0000 UTC m=+748.050228038" lastFinishedPulling="2026-02-18 12:04:04.257000712 +0000 UTC m=+751.685901583" observedRunningTime="2026-02-18 12:04:04.891089672 +0000 UTC m=+752.319990543" watchObservedRunningTime="2026-02-18 12:04:04.913962407 +0000 UTC m=+752.342863278" Feb 18 12:04:04 crc kubenswrapper[4880]: I0218 12:04:04.916622 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282" podStartSLOduration=2.853577359 podStartE2EDuration="5.916586301s" podCreationTimestamp="2026-02-18 12:03:59 +0000 UTC" firstStartedPulling="2026-02-18 12:04:01.194118483 +0000 UTC m=+748.623019344" lastFinishedPulling="2026-02-18 12:04:04.257127425 +0000 UTC m=+751.686028286" observedRunningTime="2026-02-18 12:04:04.909316409 +0000 UTC m=+752.338217270" watchObservedRunningTime="2026-02-18 12:04:04.916586301 +0000 UTC m=+752.345487162" Feb 18 12:04:06 crc 
kubenswrapper[4880]: I0218 12:04:06.885867 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs" event={"ID":"abed6693-ac3d-4890-a45d-c9781587622e","Type":"ContainerStarted","Data":"29bc4b8bfe1e4784349a80a6b179c610e5f4091e219cd97fd3226f88b6b45a83"} Feb 18 12:04:06 crc kubenswrapper[4880]: I0218 12:04:06.901675 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mwnbs" podStartSLOduration=1.7383424490000001 podStartE2EDuration="7.901655193s" podCreationTimestamp="2026-02-18 12:03:59 +0000 UTC" firstStartedPulling="2026-02-18 12:04:00.49631084 +0000 UTC m=+747.925211701" lastFinishedPulling="2026-02-18 12:04:06.659623584 +0000 UTC m=+754.088524445" observedRunningTime="2026-02-18 12:04:06.901383366 +0000 UTC m=+754.330284247" watchObservedRunningTime="2026-02-18 12:04:06.901655193 +0000 UTC m=+754.330556054" Feb 18 12:04:10 crc kubenswrapper[4880]: I0218 12:04:10.121221 4880 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 18 12:04:10 crc kubenswrapper[4880]: I0218 12:04:10.372343 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jkq5d" Feb 18 12:04:10 crc kubenswrapper[4880]: I0218 12:04:10.662664 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:10 crc kubenswrapper[4880]: I0218 12:04:10.662709 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:10 crc kubenswrapper[4880]: I0218 12:04:10.666714 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:10 crc kubenswrapper[4880]: I0218 12:04:10.912930 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/console-7bcd9774b9-xkc25" Feb 18 12:04:10 crc kubenswrapper[4880]: I0218 12:04:10.964384 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6qqbh"] Feb 18 12:04:20 crc kubenswrapper[4880]: I0218 12:04:20.936532 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-t6282" Feb 18 12:04:23 crc kubenswrapper[4880]: I0218 12:04:23.274230 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:04:23 crc kubenswrapper[4880]: I0218 12:04:23.274581 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.697739 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n"] Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.699658 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.704199 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.729457 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n"] Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.776187 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.776272 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.776313 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pncl2\" (UniqueName: \"kubernetes.io/projected/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-kube-api-access-pncl2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:34 crc kubenswrapper[4880]: 
I0218 12:04:34.877928 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.877971 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.877998 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pncl2\" (UniqueName: \"kubernetes.io/projected/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-kube-api-access-pncl2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.878480 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.879255 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:34 crc kubenswrapper[4880]: I0218 12:04:34.904861 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pncl2\" (UniqueName: \"kubernetes.io/projected/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-kube-api-access-pncl2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:35 crc kubenswrapper[4880]: I0218 12:04:35.017970 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:35 crc kubenswrapper[4880]: I0218 12:04:35.256648 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n"] Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.005519 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-6qqbh" podUID="0032e4f7-d08f-4fe5-890c-f02eb48a7f86" containerName="console" containerID="cri-o://9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d" gracePeriod=15 Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.073461 4880 generic.go:334] "Generic (PLEG): container finished" podID="3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" containerID="77159f0a4611196b73037a1e61b3096d294ba45c9faad7fa6ee0e27b675f8ac3" exitCode=0 Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.073518 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" event={"ID":"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d","Type":"ContainerDied","Data":"77159f0a4611196b73037a1e61b3096d294ba45c9faad7fa6ee0e27b675f8ac3"} Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.073549 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" event={"ID":"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d","Type":"ContainerStarted","Data":"dbe78a9f55da11a8994403b6c7120322c78e4dab02bd197cbd14e05549f2c8fb"} Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.397099 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6qqbh_0032e4f7-d08f-4fe5-890c-f02eb48a7f86/console/0.log" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.397177 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.501517 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-oauth-config\") pod \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.501572 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-trusted-ca-bundle\") pod \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.501594 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-config\") 
pod \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.501664 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-service-ca\") pod \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.501693 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-oauth-serving-cert\") pod \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.501739 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-serving-cert\") pod \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.501790 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk7xx\" (UniqueName: \"kubernetes.io/projected/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-kube-api-access-tk7xx\") pod \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\" (UID: \"0032e4f7-d08f-4fe5-890c-f02eb48a7f86\") " Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.502466 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0032e4f7-d08f-4fe5-890c-f02eb48a7f86" (UID: "0032e4f7-d08f-4fe5-890c-f02eb48a7f86"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.502755 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-service-ca" (OuterVolumeSpecName: "service-ca") pod "0032e4f7-d08f-4fe5-890c-f02eb48a7f86" (UID: "0032e4f7-d08f-4fe5-890c-f02eb48a7f86"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.503045 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-config" (OuterVolumeSpecName: "console-config") pod "0032e4f7-d08f-4fe5-890c-f02eb48a7f86" (UID: "0032e4f7-d08f-4fe5-890c-f02eb48a7f86"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.503057 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0032e4f7-d08f-4fe5-890c-f02eb48a7f86" (UID: "0032e4f7-d08f-4fe5-890c-f02eb48a7f86"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.508425 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0032e4f7-d08f-4fe5-890c-f02eb48a7f86" (UID: "0032e4f7-d08f-4fe5-890c-f02eb48a7f86"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.508572 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-kube-api-access-tk7xx" (OuterVolumeSpecName: "kube-api-access-tk7xx") pod "0032e4f7-d08f-4fe5-890c-f02eb48a7f86" (UID: "0032e4f7-d08f-4fe5-890c-f02eb48a7f86"). InnerVolumeSpecName "kube-api-access-tk7xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.509188 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0032e4f7-d08f-4fe5-890c-f02eb48a7f86" (UID: "0032e4f7-d08f-4fe5-890c-f02eb48a7f86"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.603665 4880 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.603706 4880 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.603716 4880 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-config\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.603725 4880 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.603734 4880 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.603742 4880 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:36 crc kubenswrapper[4880]: I0218 12:04:36.603750 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk7xx\" (UniqueName: \"kubernetes.io/projected/0032e4f7-d08f-4fe5-890c-f02eb48a7f86-kube-api-access-tk7xx\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:37 crc kubenswrapper[4880]: I0218 12:04:37.079954 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6qqbh_0032e4f7-d08f-4fe5-890c-f02eb48a7f86/console/0.log" Feb 18 12:04:37 crc kubenswrapper[4880]: I0218 12:04:37.080321 4880 generic.go:334] "Generic (PLEG): container finished" podID="0032e4f7-d08f-4fe5-890c-f02eb48a7f86" containerID="9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d" exitCode=2 Feb 18 12:04:37 crc kubenswrapper[4880]: I0218 12:04:37.080360 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6qqbh" event={"ID":"0032e4f7-d08f-4fe5-890c-f02eb48a7f86","Type":"ContainerDied","Data":"9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d"} Feb 18 12:04:37 crc kubenswrapper[4880]: I0218 12:04:37.080389 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-6qqbh" Feb 18 12:04:37 crc kubenswrapper[4880]: I0218 12:04:37.080402 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6qqbh" event={"ID":"0032e4f7-d08f-4fe5-890c-f02eb48a7f86","Type":"ContainerDied","Data":"73fd2c212dc4ee03146390754fc50d55e22b0525879aa567f761282a40c83869"} Feb 18 12:04:37 crc kubenswrapper[4880]: I0218 12:04:37.080438 4880 scope.go:117] "RemoveContainer" containerID="9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d" Feb 18 12:04:37 crc kubenswrapper[4880]: I0218 12:04:37.109157 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6qqbh"] Feb 18 12:04:37 crc kubenswrapper[4880]: I0218 12:04:37.112365 4880 scope.go:117] "RemoveContainer" containerID="9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d" Feb 18 12:04:37 crc kubenswrapper[4880]: E0218 12:04:37.112899 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d\": container with ID starting with 9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d not found: ID does not exist" containerID="9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d" Feb 18 12:04:37 crc kubenswrapper[4880]: I0218 12:04:37.112959 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d"} err="failed to get container status \"9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d\": rpc error: code = NotFound desc = could not find container \"9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d\": container with ID starting with 9856e90fa050146995c42b02cc03247c98425c2ffa9f74e67bb731f316ca595d not found: ID does not exist" Feb 18 12:04:37 crc 
kubenswrapper[4880]: I0218 12:04:37.116989 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-6qqbh"] Feb 18 12:04:37 crc kubenswrapper[4880]: I0218 12:04:37.186887 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0032e4f7-d08f-4fe5-890c-f02eb48a7f86" path="/var/lib/kubelet/pods/0032e4f7-d08f-4fe5-890c-f02eb48a7f86/volumes" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.239761 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xh65x"] Feb 18 12:04:38 crc kubenswrapper[4880]: E0218 12:04:38.240980 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0032e4f7-d08f-4fe5-890c-f02eb48a7f86" containerName="console" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.241075 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="0032e4f7-d08f-4fe5-890c-f02eb48a7f86" containerName="console" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.241297 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="0032e4f7-d08f-4fe5-890c-f02eb48a7f86" containerName="console" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.242512 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.254172 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xh65x"] Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.425457 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-utilities\") pod \"redhat-operators-xh65x\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.425510 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-catalog-content\") pod \"redhat-operators-xh65x\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.425742 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxnbt\" (UniqueName: \"kubernetes.io/projected/4b71544d-3f96-4d18-b802-95d94280b841-kube-api-access-fxnbt\") pod \"redhat-operators-xh65x\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.527087 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-catalog-content\") pod \"redhat-operators-xh65x\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.527187 4880 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-fxnbt\" (UniqueName: \"kubernetes.io/projected/4b71544d-3f96-4d18-b802-95d94280b841-kube-api-access-fxnbt\") pod \"redhat-operators-xh65x\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.527224 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-utilities\") pod \"redhat-operators-xh65x\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.527546 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-catalog-content\") pod \"redhat-operators-xh65x\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.527634 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-utilities\") pod \"redhat-operators-xh65x\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.570594 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxnbt\" (UniqueName: \"kubernetes.io/projected/4b71544d-3f96-4d18-b802-95d94280b841-kube-api-access-fxnbt\") pod \"redhat-operators-xh65x\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:38 crc kubenswrapper[4880]: I0218 12:04:38.862047 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:39 crc kubenswrapper[4880]: I0218 12:04:39.096269 4880 generic.go:334] "Generic (PLEG): container finished" podID="3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" containerID="a3148b3fa56363a3bb0c4a603aaa8840d50d96808a0f56c8e3ae483203029ad6" exitCode=0 Feb 18 12:04:39 crc kubenswrapper[4880]: I0218 12:04:39.096311 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" event={"ID":"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d","Type":"ContainerDied","Data":"a3148b3fa56363a3bb0c4a603aaa8840d50d96808a0f56c8e3ae483203029ad6"} Feb 18 12:04:39 crc kubenswrapper[4880]: I0218 12:04:39.319511 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xh65x"] Feb 18 12:04:40 crc kubenswrapper[4880]: I0218 12:04:40.104101 4880 generic.go:334] "Generic (PLEG): container finished" podID="4b71544d-3f96-4d18-b802-95d94280b841" containerID="c860f83839d05268be5227dcc428f0611108f43a551db5abbbca6b03fe5b2ce0" exitCode=0 Feb 18 12:04:40 crc kubenswrapper[4880]: I0218 12:04:40.104147 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh65x" event={"ID":"4b71544d-3f96-4d18-b802-95d94280b841","Type":"ContainerDied","Data":"c860f83839d05268be5227dcc428f0611108f43a551db5abbbca6b03fe5b2ce0"} Feb 18 12:04:40 crc kubenswrapper[4880]: I0218 12:04:40.104462 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh65x" event={"ID":"4b71544d-3f96-4d18-b802-95d94280b841","Type":"ContainerStarted","Data":"efcb895ec0986b398564e0037798aaa3c08b8504ee742d1dd6b59faf727b517e"} Feb 18 12:04:40 crc kubenswrapper[4880]: I0218 12:04:40.107652 4880 generic.go:334] "Generic (PLEG): container finished" podID="3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" 
containerID="1d87d6710e5f60d70b9f7fec5b037a58d5eb381e3c944692bd31d6c41df34121" exitCode=0 Feb 18 12:04:40 crc kubenswrapper[4880]: I0218 12:04:40.107684 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" event={"ID":"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d","Type":"ContainerDied","Data":"1d87d6710e5f60d70b9f7fec5b037a58d5eb381e3c944692bd31d6c41df34121"} Feb 18 12:04:41 crc kubenswrapper[4880]: I0218 12:04:41.360276 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:41 crc kubenswrapper[4880]: I0218 12:04:41.479848 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-util\") pod \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " Feb 18 12:04:41 crc kubenswrapper[4880]: I0218 12:04:41.480154 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-bundle\") pod \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " Feb 18 12:04:41 crc kubenswrapper[4880]: I0218 12:04:41.480257 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pncl2\" (UniqueName: \"kubernetes.io/projected/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-kube-api-access-pncl2\") pod \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\" (UID: \"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d\") " Feb 18 12:04:41 crc kubenswrapper[4880]: I0218 12:04:41.481395 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-bundle" (OuterVolumeSpecName: "bundle") pod 
"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" (UID: "3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:04:41 crc kubenswrapper[4880]: I0218 12:04:41.485802 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-kube-api-access-pncl2" (OuterVolumeSpecName: "kube-api-access-pncl2") pod "3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" (UID: "3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d"). InnerVolumeSpecName "kube-api-access-pncl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:04:41 crc kubenswrapper[4880]: I0218 12:04:41.490633 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-util" (OuterVolumeSpecName: "util") pod "3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" (UID: "3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:04:41 crc kubenswrapper[4880]: I0218 12:04:41.581999 4880 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-util\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:41 crc kubenswrapper[4880]: I0218 12:04:41.582056 4880 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:41 crc kubenswrapper[4880]: I0218 12:04:41.582071 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pncl2\" (UniqueName: \"kubernetes.io/projected/3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d-kube-api-access-pncl2\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:42 crc kubenswrapper[4880]: I0218 12:04:42.120331 4880 generic.go:334] "Generic (PLEG): container finished" podID="4b71544d-3f96-4d18-b802-95d94280b841" 
containerID="16dbea26dadcf23bf78415b3e3d162e06e8acf061624fd9ea75afd0c14fbb017" exitCode=0 Feb 18 12:04:42 crc kubenswrapper[4880]: I0218 12:04:42.120411 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh65x" event={"ID":"4b71544d-3f96-4d18-b802-95d94280b841","Type":"ContainerDied","Data":"16dbea26dadcf23bf78415b3e3d162e06e8acf061624fd9ea75afd0c14fbb017"} Feb 18 12:04:42 crc kubenswrapper[4880]: I0218 12:04:42.124030 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" event={"ID":"3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d","Type":"ContainerDied","Data":"dbe78a9f55da11a8994403b6c7120322c78e4dab02bd197cbd14e05549f2c8fb"} Feb 18 12:04:42 crc kubenswrapper[4880]: I0218 12:04:42.124079 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbe78a9f55da11a8994403b6c7120322c78e4dab02bd197cbd14e05549f2c8fb" Feb 18 12:04:42 crc kubenswrapper[4880]: I0218 12:04:42.124164 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n" Feb 18 12:04:43 crc kubenswrapper[4880]: I0218 12:04:43.142814 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh65x" event={"ID":"4b71544d-3f96-4d18-b802-95d94280b841","Type":"ContainerStarted","Data":"9146fd2413f24775e08e242cd399f828319d88170f318e4e331e328376992f61"} Feb 18 12:04:43 crc kubenswrapper[4880]: I0218 12:04:43.163780 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xh65x" podStartSLOduration=2.763179912 podStartE2EDuration="5.163753107s" podCreationTimestamp="2026-02-18 12:04:38 +0000 UTC" firstStartedPulling="2026-02-18 12:04:40.106123734 +0000 UTC m=+787.535024595" lastFinishedPulling="2026-02-18 12:04:42.506696929 +0000 UTC m=+789.935597790" observedRunningTime="2026-02-18 12:04:43.158445479 +0000 UTC m=+790.587346340" watchObservedRunningTime="2026-02-18 12:04:43.163753107 +0000 UTC m=+790.592653968" Feb 18 12:04:48 crc kubenswrapper[4880]: I0218 12:04:48.863209 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:48 crc kubenswrapper[4880]: I0218 12:04:48.863886 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:48 crc kubenswrapper[4880]: I0218 12:04:48.906173 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:49 crc kubenswrapper[4880]: I0218 12:04:49.225328 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:50 crc kubenswrapper[4880]: I0218 12:04:50.828753 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xh65x"] Feb 18 
12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.191894 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xh65x" podUID="4b71544d-3f96-4d18-b802-95d94280b841" containerName="registry-server" containerID="cri-o://9146fd2413f24775e08e242cd399f828319d88170f318e4e331e328376992f61" gracePeriod=2 Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.803043 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s"] Feb 18 12:04:51 crc kubenswrapper[4880]: E0218 12:04:51.803720 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" containerName="extract" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.803741 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" containerName="extract" Feb 18 12:04:51 crc kubenswrapper[4880]: E0218 12:04:51.803765 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" containerName="util" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.803775 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" containerName="util" Feb 18 12:04:51 crc kubenswrapper[4880]: E0218 12:04:51.803785 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" containerName="pull" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.803793 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" containerName="pull" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.803937 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d" containerName="extract" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.804472 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.810222 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.810229 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.810249 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-m6pbb" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.810450 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.810447 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.824702 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9f4k\" (UniqueName: \"kubernetes.io/projected/e087df23-40b0-4ce0-bd5c-fc2431364d8d-kube-api-access-v9f4k\") pod \"metallb-operator-controller-manager-5d4f87c4b6-kq42s\" (UID: \"e087df23-40b0-4ce0-bd5c-fc2431364d8d\") " pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.824763 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e087df23-40b0-4ce0-bd5c-fc2431364d8d-webhook-cert\") pod \"metallb-operator-controller-manager-5d4f87c4b6-kq42s\" (UID: \"e087df23-40b0-4ce0-bd5c-fc2431364d8d\") " pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 
12:04:51.824805 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e087df23-40b0-4ce0-bd5c-fc2431364d8d-apiservice-cert\") pod \"metallb-operator-controller-manager-5d4f87c4b6-kq42s\" (UID: \"e087df23-40b0-4ce0-bd5c-fc2431364d8d\") " pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.825598 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s"] Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.926090 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e087df23-40b0-4ce0-bd5c-fc2431364d8d-apiservice-cert\") pod \"metallb-operator-controller-manager-5d4f87c4b6-kq42s\" (UID: \"e087df23-40b0-4ce0-bd5c-fc2431364d8d\") " pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.926313 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9f4k\" (UniqueName: \"kubernetes.io/projected/e087df23-40b0-4ce0-bd5c-fc2431364d8d-kube-api-access-v9f4k\") pod \"metallb-operator-controller-manager-5d4f87c4b6-kq42s\" (UID: \"e087df23-40b0-4ce0-bd5c-fc2431364d8d\") " pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.926360 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e087df23-40b0-4ce0-bd5c-fc2431364d8d-webhook-cert\") pod \"metallb-operator-controller-manager-5d4f87c4b6-kq42s\" (UID: \"e087df23-40b0-4ce0-bd5c-fc2431364d8d\") " pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 
12:04:51.932072 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e087df23-40b0-4ce0-bd5c-fc2431364d8d-webhook-cert\") pod \"metallb-operator-controller-manager-5d4f87c4b6-kq42s\" (UID: \"e087df23-40b0-4ce0-bd5c-fc2431364d8d\") " pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.935178 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e087df23-40b0-4ce0-bd5c-fc2431364d8d-apiservice-cert\") pod \"metallb-operator-controller-manager-5d4f87c4b6-kq42s\" (UID: \"e087df23-40b0-4ce0-bd5c-fc2431364d8d\") " pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:51 crc kubenswrapper[4880]: I0218 12:04:51.942737 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9f4k\" (UniqueName: \"kubernetes.io/projected/e087df23-40b0-4ce0-bd5c-fc2431364d8d-kube-api-access-v9f4k\") pod \"metallb-operator-controller-manager-5d4f87c4b6-kq42s\" (UID: \"e087df23-40b0-4ce0-bd5c-fc2431364d8d\") " pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.055087 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl"] Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.056096 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.057579 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-22h52" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.057961 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.058290 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.081357 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl"] Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.126390 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.130474 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvwv9\" (UniqueName: \"kubernetes.io/projected/fbb8b5bd-7246-423b-b444-cbb33f9bdad3-kube-api-access-pvwv9\") pod \"metallb-operator-webhook-server-c6d8c54dd-zjnzl\" (UID: \"fbb8b5bd-7246-423b-b444-cbb33f9bdad3\") " pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.130577 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fbb8b5bd-7246-423b-b444-cbb33f9bdad3-apiservice-cert\") pod \"metallb-operator-webhook-server-c6d8c54dd-zjnzl\" (UID: \"fbb8b5bd-7246-423b-b444-cbb33f9bdad3\") " pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 
12:04:52.130635 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fbb8b5bd-7246-423b-b444-cbb33f9bdad3-webhook-cert\") pod \"metallb-operator-webhook-server-c6d8c54dd-zjnzl\" (UID: \"fbb8b5bd-7246-423b-b444-cbb33f9bdad3\") " pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.232762 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvwv9\" (UniqueName: \"kubernetes.io/projected/fbb8b5bd-7246-423b-b444-cbb33f9bdad3-kube-api-access-pvwv9\") pod \"metallb-operator-webhook-server-c6d8c54dd-zjnzl\" (UID: \"fbb8b5bd-7246-423b-b444-cbb33f9bdad3\") " pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.232878 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fbb8b5bd-7246-423b-b444-cbb33f9bdad3-apiservice-cert\") pod \"metallb-operator-webhook-server-c6d8c54dd-zjnzl\" (UID: \"fbb8b5bd-7246-423b-b444-cbb33f9bdad3\") " pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.232923 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fbb8b5bd-7246-423b-b444-cbb33f9bdad3-webhook-cert\") pod \"metallb-operator-webhook-server-c6d8c54dd-zjnzl\" (UID: \"fbb8b5bd-7246-423b-b444-cbb33f9bdad3\") " pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.239183 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fbb8b5bd-7246-423b-b444-cbb33f9bdad3-webhook-cert\") pod \"metallb-operator-webhook-server-c6d8c54dd-zjnzl\" 
(UID: \"fbb8b5bd-7246-423b-b444-cbb33f9bdad3\") " pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.247336 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fbb8b5bd-7246-423b-b444-cbb33f9bdad3-apiservice-cert\") pod \"metallb-operator-webhook-server-c6d8c54dd-zjnzl\" (UID: \"fbb8b5bd-7246-423b-b444-cbb33f9bdad3\") " pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.250243 4880 generic.go:334] "Generic (PLEG): container finished" podID="4b71544d-3f96-4d18-b802-95d94280b841" containerID="9146fd2413f24775e08e242cd399f828319d88170f318e4e331e328376992f61" exitCode=0 Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.250288 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh65x" event={"ID":"4b71544d-3f96-4d18-b802-95d94280b841","Type":"ContainerDied","Data":"9146fd2413f24775e08e242cd399f828319d88170f318e4e331e328376992f61"} Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.252140 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.264570 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvwv9\" (UniqueName: \"kubernetes.io/projected/fbb8b5bd-7246-423b-b444-cbb33f9bdad3-kube-api-access-pvwv9\") pod \"metallb-operator-webhook-server-c6d8c54dd-zjnzl\" (UID: \"fbb8b5bd-7246-423b-b444-cbb33f9bdad3\") " pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.381308 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.436503 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxnbt\" (UniqueName: \"kubernetes.io/projected/4b71544d-3f96-4d18-b802-95d94280b841-kube-api-access-fxnbt\") pod \"4b71544d-3f96-4d18-b802-95d94280b841\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.436649 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-catalog-content\") pod \"4b71544d-3f96-4d18-b802-95d94280b841\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.436727 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-utilities\") pod \"4b71544d-3f96-4d18-b802-95d94280b841\" (UID: \"4b71544d-3f96-4d18-b802-95d94280b841\") " Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.438327 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-utilities" (OuterVolumeSpecName: "utilities") pod "4b71544d-3f96-4d18-b802-95d94280b841" (UID: "4b71544d-3f96-4d18-b802-95d94280b841"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.441726 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b71544d-3f96-4d18-b802-95d94280b841-kube-api-access-fxnbt" (OuterVolumeSpecName: "kube-api-access-fxnbt") pod "4b71544d-3f96-4d18-b802-95d94280b841" (UID: "4b71544d-3f96-4d18-b802-95d94280b841"). InnerVolumeSpecName "kube-api-access-fxnbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.529501 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s"] Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.540484 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.540513 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxnbt\" (UniqueName: \"kubernetes.io/projected/4b71544d-3f96-4d18-b802-95d94280b841-kube-api-access-fxnbt\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:52 crc kubenswrapper[4880]: W0218 12:04:52.550963 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode087df23_40b0_4ce0_bd5c_fc2431364d8d.slice/crio-bc5767847e27270d13ec97b696fe9302b3c00cbe0cbe3bff3a408e02f53f6f70 WatchSource:0}: Error finding container bc5767847e27270d13ec97b696fe9302b3c00cbe0cbe3bff3a408e02f53f6f70: Status 404 returned error can't find the container with id bc5767847e27270d13ec97b696fe9302b3c00cbe0cbe3bff3a408e02f53f6f70 Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.571439 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b71544d-3f96-4d18-b802-95d94280b841" (UID: "4b71544d-3f96-4d18-b802-95d94280b841"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.641861 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b71544d-3f96-4d18-b802-95d94280b841-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:04:52 crc kubenswrapper[4880]: I0218 12:04:52.697048 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl"] Feb 18 12:04:52 crc kubenswrapper[4880]: W0218 12:04:52.719527 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbb8b5bd_7246_423b_b444_cbb33f9bdad3.slice/crio-f0d6717f2655db7d3a8bc538676729a7440ce5d698268f635aa2f95c42e3b774 WatchSource:0}: Error finding container f0d6717f2655db7d3a8bc538676729a7440ce5d698268f635aa2f95c42e3b774: Status 404 returned error can't find the container with id f0d6717f2655db7d3a8bc538676729a7440ce5d698268f635aa2f95c42e3b774 Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.257688 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" event={"ID":"fbb8b5bd-7246-423b-b444-cbb33f9bdad3","Type":"ContainerStarted","Data":"f0d6717f2655db7d3a8bc538676729a7440ce5d698268f635aa2f95c42e3b774"} Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.260062 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" event={"ID":"e087df23-40b0-4ce0-bd5c-fc2431364d8d","Type":"ContainerStarted","Data":"bc5767847e27270d13ec97b696fe9302b3c00cbe0cbe3bff3a408e02f53f6f70"} Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.262523 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh65x" 
event={"ID":"4b71544d-3f96-4d18-b802-95d94280b841","Type":"ContainerDied","Data":"efcb895ec0986b398564e0037798aaa3c08b8504ee742d1dd6b59faf727b517e"} Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.262555 4880 scope.go:117] "RemoveContainer" containerID="9146fd2413f24775e08e242cd399f828319d88170f318e4e331e328376992f61" Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.262706 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xh65x" Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.273693 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.273740 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.273774 4880 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.274312 4880 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ffb3f920a68020d12bee188c9effd93292465486a03ad6482f8c49de81c5a836"} pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.274360 4880 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" containerID="cri-o://ffb3f920a68020d12bee188c9effd93292465486a03ad6482f8c49de81c5a836" gracePeriod=600 Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.281971 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xh65x"] Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.283911 4880 scope.go:117] "RemoveContainer" containerID="16dbea26dadcf23bf78415b3e3d162e06e8acf061624fd9ea75afd0c14fbb017" Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.286818 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xh65x"] Feb 18 12:04:53 crc kubenswrapper[4880]: I0218 12:04:53.306413 4880 scope.go:117] "RemoveContainer" containerID="c860f83839d05268be5227dcc428f0611108f43a551db5abbbca6b03fe5b2ce0" Feb 18 12:04:54 crc kubenswrapper[4880]: I0218 12:04:54.276593 4880 generic.go:334] "Generic (PLEG): container finished" podID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerID="ffb3f920a68020d12bee188c9effd93292465486a03ad6482f8c49de81c5a836" exitCode=0 Feb 18 12:04:54 crc kubenswrapper[4880]: I0218 12:04:54.276645 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerDied","Data":"ffb3f920a68020d12bee188c9effd93292465486a03ad6482f8c49de81c5a836"} Feb 18 12:04:54 crc kubenswrapper[4880]: I0218 12:04:54.276925 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerStarted","Data":"d675a88a38ecd91c14a002c1ada584ee97f355276bf8e2925348d67f9f1846e3"} Feb 18 12:04:54 crc kubenswrapper[4880]: I0218 
12:04:54.276948 4880 scope.go:117] "RemoveContainer" containerID="ed8d7757e9e56a505647e83dce073d85ff43cb245ee9dfe28ed0b00f01fd6740" Feb 18 12:04:55 crc kubenswrapper[4880]: I0218 12:04:55.194809 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b71544d-3f96-4d18-b802-95d94280b841" path="/var/lib/kubelet/pods/4b71544d-3f96-4d18-b802-95d94280b841/volumes" Feb 18 12:04:56 crc kubenswrapper[4880]: I0218 12:04:56.295091 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" event={"ID":"e087df23-40b0-4ce0-bd5c-fc2431364d8d","Type":"ContainerStarted","Data":"da4f2c7c733e4a5df1a415460249b25a5beb9e1c9c845fd075978289d226433f"} Feb 18 12:04:56 crc kubenswrapper[4880]: I0218 12:04:56.315328 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" podStartSLOduration=2.469334981 podStartE2EDuration="5.31531162s" podCreationTimestamp="2026-02-18 12:04:51 +0000 UTC" firstStartedPulling="2026-02-18 12:04:52.556042608 +0000 UTC m=+799.984943479" lastFinishedPulling="2026-02-18 12:04:55.402019237 +0000 UTC m=+802.830920118" observedRunningTime="2026-02-18 12:04:56.314175168 +0000 UTC m=+803.743076039" watchObservedRunningTime="2026-02-18 12:04:56.31531162 +0000 UTC m=+803.744212481" Feb 18 12:04:57 crc kubenswrapper[4880]: I0218 12:04:57.302693 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:04:58 crc kubenswrapper[4880]: I0218 12:04:58.309961 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" event={"ID":"fbb8b5bd-7246-423b-b444-cbb33f9bdad3","Type":"ContainerStarted","Data":"9aafcc243afc7e36b301771e59c993e8a6084c91380789836185f2557fd71cee"} Feb 18 12:04:58 crc kubenswrapper[4880]: I0218 12:04:58.327060 4880 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" podStartSLOduration=1.756480805 podStartE2EDuration="6.327041754s" podCreationTimestamp="2026-02-18 12:04:52 +0000 UTC" firstStartedPulling="2026-02-18 12:04:52.722133146 +0000 UTC m=+800.151033997" lastFinishedPulling="2026-02-18 12:04:57.292694085 +0000 UTC m=+804.721594946" observedRunningTime="2026-02-18 12:04:58.324780861 +0000 UTC m=+805.753681732" watchObservedRunningTime="2026-02-18 12:04:58.327041754 +0000 UTC m=+805.755942615" Feb 18 12:04:59 crc kubenswrapper[4880]: I0218 12:04:59.315752 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:05:12 crc kubenswrapper[4880]: I0218 12:05:12.387797 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-c6d8c54dd-zjnzl" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.130342 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5d4f87c4b6-kq42s" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.809864 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-fhz9j"] Feb 18 12:05:32 crc kubenswrapper[4880]: E0218 12:05:32.810303 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b71544d-3f96-4d18-b802-95d94280b841" containerName="extract-content" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.810334 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b71544d-3f96-4d18-b802-95d94280b841" containerName="extract-content" Feb 18 12:05:32 crc kubenswrapper[4880]: E0218 12:05:32.810349 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b71544d-3f96-4d18-b802-95d94280b841" containerName="registry-server" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 
12:05:32.810360 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b71544d-3f96-4d18-b802-95d94280b841" containerName="registry-server" Feb 18 12:05:32 crc kubenswrapper[4880]: E0218 12:05:32.810374 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b71544d-3f96-4d18-b802-95d94280b841" containerName="extract-utilities" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.810382 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b71544d-3f96-4d18-b802-95d94280b841" containerName="extract-utilities" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.810547 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b71544d-3f96-4d18-b802-95d94280b841" containerName="registry-server" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.813066 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.815042 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh"] Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.815424 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.816038 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-q78hr" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.816140 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.816226 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.817959 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.838146 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh"] Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.906858 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-n64cr"] Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.908270 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-n64cr" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.914804 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.919063 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.919077 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zhnf4" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.920236 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.920691 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-5h66c"] Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.921710 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-5h66c" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.923865 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.940669 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-5h66c"] Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.990904 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-frr-conf\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.990964 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-frr-sockets\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.990988 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-reloader\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.991157 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqz5k\" (UniqueName: \"kubernetes.io/projected/08be2c36-3a51-4cee-a9aa-f2140d689e95-kube-api-access-bqz5k\") pod \"frr-k8s-webhook-server-78b44bf5bb-5fmhh\" (UID: \"08be2c36-3a51-4cee-a9aa-f2140d689e95\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh" Feb 18 12:05:32 
crc kubenswrapper[4880]: I0218 12:05:32.991218 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fngn\" (UniqueName: \"kubernetes.io/projected/254a0886-f36a-4066-a0e4-3b35f835d81c-kube-api-access-2fngn\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.991347 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/254a0886-f36a-4066-a0e4-3b35f835d81c-metrics-certs\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.991429 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-metrics\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.991540 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/254a0886-f36a-4066-a0e4-3b35f835d81c-frr-startup\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:32 crc kubenswrapper[4880]: I0218 12:05:32.991580 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08be2c36-3a51-4cee-a9aa-f2140d689e95-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5fmhh\" (UID: \"08be2c36-3a51-4cee-a9aa-f2140d689e95\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092664 4880 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-metrics\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092747 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2514b7-fa9f-4be3-8eef-4558bb35093e-metrics-certs\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092769 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/254a0886-f36a-4066-a0e4-3b35f835d81c-frr-startup\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092790 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ea27bdd-ddfd-4c67-a386-cf768216df4a-cert\") pod \"controller-69bbfbf88f-5h66c\" (UID: \"6ea27bdd-ddfd-4c67-a386-cf768216df4a\") " pod="metallb-system/controller-69bbfbf88f-5h66c" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092810 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08be2c36-3a51-4cee-a9aa-f2140d689e95-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5fmhh\" (UID: \"08be2c36-3a51-4cee-a9aa-f2140d689e95\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092833 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnrjp\" 
(UniqueName: \"kubernetes.io/projected/6ea27bdd-ddfd-4c67-a386-cf768216df4a-kube-api-access-tnrjp\") pod \"controller-69bbfbf88f-5h66c\" (UID: \"6ea27bdd-ddfd-4c67-a386-cf768216df4a\") " pod="metallb-system/controller-69bbfbf88f-5h66c" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092858 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-frr-conf\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092879 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-frr-sockets\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092896 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2c2514b7-fa9f-4be3-8eef-4558bb35093e-metallb-excludel2\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092913 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-reloader\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092949 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqz5k\" (UniqueName: \"kubernetes.io/projected/08be2c36-3a51-4cee-a9aa-f2140d689e95-kube-api-access-bqz5k\") pod 
\"frr-k8s-webhook-server-78b44bf5bb-5fmhh\" (UID: \"08be2c36-3a51-4cee-a9aa-f2140d689e95\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092966 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2c2514b7-fa9f-4be3-8eef-4558bb35093e-memberlist\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.092987 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fngn\" (UniqueName: \"kubernetes.io/projected/254a0886-f36a-4066-a0e4-3b35f835d81c-kube-api-access-2fngn\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.093025 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ea27bdd-ddfd-4c67-a386-cf768216df4a-metrics-certs\") pod \"controller-69bbfbf88f-5h66c\" (UID: \"6ea27bdd-ddfd-4c67-a386-cf768216df4a\") " pod="metallb-system/controller-69bbfbf88f-5h66c" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.093046 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/254a0886-f36a-4066-a0e4-3b35f835d81c-metrics-certs\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.093079 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65fbc\" (UniqueName: \"kubernetes.io/projected/2c2514b7-fa9f-4be3-8eef-4558bb35093e-kube-api-access-65fbc\") pod \"speaker-n64cr\" (UID: 
\"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.093174 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-metrics\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.093485 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-frr-conf\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.093817 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-frr-sockets\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.093847 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/254a0886-f36a-4066-a0e4-3b35f835d81c-frr-startup\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.093868 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/254a0886-f36a-4066-a0e4-3b35f835d81c-reloader\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.109083 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/254a0886-f36a-4066-a0e4-3b35f835d81c-metrics-certs\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.109493 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08be2c36-3a51-4cee-a9aa-f2140d689e95-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5fmhh\" (UID: \"08be2c36-3a51-4cee-a9aa-f2140d689e95\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.113098 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fngn\" (UniqueName: \"kubernetes.io/projected/254a0886-f36a-4066-a0e4-3b35f835d81c-kube-api-access-2fngn\") pod \"frr-k8s-fhz9j\" (UID: \"254a0886-f36a-4066-a0e4-3b35f835d81c\") " pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.113372 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqz5k\" (UniqueName: \"kubernetes.io/projected/08be2c36-3a51-4cee-a9aa-f2140d689e95-kube-api-access-bqz5k\") pod \"frr-k8s-webhook-server-78b44bf5bb-5fmhh\" (UID: \"08be2c36-3a51-4cee-a9aa-f2140d689e95\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.135370 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-q78hr" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.142768 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fhz9j" Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.142948 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.194396 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ea27bdd-ddfd-4c67-a386-cf768216df4a-metrics-certs\") pod \"controller-69bbfbf88f-5h66c\" (UID: \"6ea27bdd-ddfd-4c67-a386-cf768216df4a\") " pod="metallb-system/controller-69bbfbf88f-5h66c"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.194470 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65fbc\" (UniqueName: \"kubernetes.io/projected/2c2514b7-fa9f-4be3-8eef-4558bb35093e-kube-api-access-65fbc\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.194511 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2514b7-fa9f-4be3-8eef-4558bb35093e-metrics-certs\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.194529 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ea27bdd-ddfd-4c67-a386-cf768216df4a-cert\") pod \"controller-69bbfbf88f-5h66c\" (UID: \"6ea27bdd-ddfd-4c67-a386-cf768216df4a\") " pod="metallb-system/controller-69bbfbf88f-5h66c"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.194555 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnrjp\" (UniqueName: \"kubernetes.io/projected/6ea27bdd-ddfd-4c67-a386-cf768216df4a-kube-api-access-tnrjp\") pod \"controller-69bbfbf88f-5h66c\" (UID: \"6ea27bdd-ddfd-4c67-a386-cf768216df4a\") " pod="metallb-system/controller-69bbfbf88f-5h66c"
Feb 18 
12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.194586 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2c2514b7-fa9f-4be3-8eef-4558bb35093e-metallb-excludel2\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.194643 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2c2514b7-fa9f-4be3-8eef-4558bb35093e-memberlist\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.200093 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.200126 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.200252 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.200299 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.200127 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Feb 18 12:05:33 crc kubenswrapper[4880]: E0218 12:05:33.205785 4880 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Feb 18 12:05:33 crc kubenswrapper[4880]: E0218 12:05:33.205889 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2514b7-fa9f-4be3-8eef-4558bb35093e-memberlist podName:2c2514b7-fa9f-4be3-8eef-4558bb35093e nodeName:}" 
failed. No retries permitted until 2026-02-18 12:05:33.705857198 +0000 UTC m=+841.134758059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2c2514b7-fa9f-4be3-8eef-4558bb35093e-memberlist") pod "speaker-n64cr" (UID: "2c2514b7-fa9f-4be3-8eef-4558bb35093e") : secret "metallb-memberlist" not found
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.206885 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2c2514b7-fa9f-4be3-8eef-4558bb35093e-metallb-excludel2\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.210296 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c2514b7-fa9f-4be3-8eef-4558bb35093e-metrics-certs\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.210540 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ea27bdd-ddfd-4c67-a386-cf768216df4a-metrics-certs\") pod \"controller-69bbfbf88f-5h66c\" (UID: \"6ea27bdd-ddfd-4c67-a386-cf768216df4a\") " pod="metallb-system/controller-69bbfbf88f-5h66c"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.211878 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ea27bdd-ddfd-4c67-a386-cf768216df4a-cert\") pod \"controller-69bbfbf88f-5h66c\" (UID: \"6ea27bdd-ddfd-4c67-a386-cf768216df4a\") " pod="metallb-system/controller-69bbfbf88f-5h66c"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.214070 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnrjp\" (UniqueName: 
\"kubernetes.io/projected/6ea27bdd-ddfd-4c67-a386-cf768216df4a-kube-api-access-tnrjp\") pod \"controller-69bbfbf88f-5h66c\" (UID: \"6ea27bdd-ddfd-4c67-a386-cf768216df4a\") " pod="metallb-system/controller-69bbfbf88f-5h66c"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.214386 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65fbc\" (UniqueName: \"kubernetes.io/projected/2c2514b7-fa9f-4be3-8eef-4558bb35093e-kube-api-access-65fbc\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.233376 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-5h66c"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.487813 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-5h66c"]
Feb 18 12:05:33 crc kubenswrapper[4880]: W0218 12:05:33.490561 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ea27bdd_ddfd_4c67_a386_cf768216df4a.slice/crio-ad8dc609a17602722f0dbc62996b5640cc43adb709dc20978b2d49a931ca8699 WatchSource:0}: Error finding container ad8dc609a17602722f0dbc62996b5640cc43adb709dc20978b2d49a931ca8699: Status 404 returned error can't find the container with id ad8dc609a17602722f0dbc62996b5640cc43adb709dc20978b2d49a931ca8699
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.529765 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-5h66c" event={"ID":"6ea27bdd-ddfd-4c67-a386-cf768216df4a","Type":"ContainerStarted","Data":"ad8dc609a17602722f0dbc62996b5640cc43adb709dc20978b2d49a931ca8699"}
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.532439 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fhz9j" 
event={"ID":"254a0886-f36a-4066-a0e4-3b35f835d81c","Type":"ContainerStarted","Data":"e198e5a7785130440e4e5cfb16b445ffd0330591503911ff0fdc68d6686be165"}
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.651715 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh"]
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.803585 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2c2514b7-fa9f-4be3-8eef-4558bb35093e-memberlist\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.809568 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2c2514b7-fa9f-4be3-8eef-4558bb35093e-memberlist\") pod \"speaker-n64cr\" (UID: \"2c2514b7-fa9f-4be3-8eef-4558bb35093e\") " pod="metallb-system/speaker-n64cr"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.825748 4880 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zhnf4"
Feb 18 12:05:33 crc kubenswrapper[4880]: I0218 12:05:33.832705 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-n64cr"
Feb 18 12:05:34 crc kubenswrapper[4880]: I0218 12:05:34.542680 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-n64cr" event={"ID":"2c2514b7-fa9f-4be3-8eef-4558bb35093e","Type":"ContainerStarted","Data":"e86b550e0ace52c3630e1747696d81556f8fc9e8a49c1362a87eea8cc34e2d5b"}
Feb 18 12:05:34 crc kubenswrapper[4880]: I0218 12:05:34.543086 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-n64cr" event={"ID":"2c2514b7-fa9f-4be3-8eef-4558bb35093e","Type":"ContainerStarted","Data":"5bc1b0df5ee41fc31096886b1972f3b397cb497e053e7b29af3a27382e26a294"}
Feb 18 12:05:34 crc kubenswrapper[4880]: I0218 12:05:34.543106 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-n64cr" event={"ID":"2c2514b7-fa9f-4be3-8eef-4558bb35093e","Type":"ContainerStarted","Data":"e27af3df1ac240d344dfa5eff75b29de95e8761634d9ec18edc3e686b1fb8383"}
Feb 18 12:05:34 crc kubenswrapper[4880]: I0218 12:05:34.544349 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-n64cr"
Feb 18 12:05:34 crc kubenswrapper[4880]: I0218 12:05:34.547307 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh" event={"ID":"08be2c36-3a51-4cee-a9aa-f2140d689e95","Type":"ContainerStarted","Data":"cd712b40786b1997017c66156b6eafc70c24aed184eadf447c8eac534da89fb2"}
Feb 18 12:05:34 crc kubenswrapper[4880]: I0218 12:05:34.549806 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-5h66c" event={"ID":"6ea27bdd-ddfd-4c67-a386-cf768216df4a","Type":"ContainerStarted","Data":"2a0610f3ab0bac3d55877f21c72af017c36c507ca5a29f7e5ca39094af9936fd"}
Feb 18 12:05:34 crc kubenswrapper[4880]: I0218 12:05:34.549854 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-5h66c" 
event={"ID":"6ea27bdd-ddfd-4c67-a386-cf768216df4a","Type":"ContainerStarted","Data":"2d972aeb6b4cd527c4ff614d53eaeb5109685861dffb516fbc193ca34634c33c"}
Feb 18 12:05:34 crc kubenswrapper[4880]: I0218 12:05:34.550412 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-5h66c"
Feb 18 12:05:34 crc kubenswrapper[4880]: I0218 12:05:34.572785 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-n64cr" podStartSLOduration=2.5727560130000002 podStartE2EDuration="2.572756013s" podCreationTimestamp="2026-02-18 12:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:05:34.564864563 +0000 UTC m=+841.993765424" watchObservedRunningTime="2026-02-18 12:05:34.572756013 +0000 UTC m=+842.001656874"
Feb 18 12:05:34 crc kubenswrapper[4880]: I0218 12:05:34.592164 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-5h66c" podStartSLOduration=2.592138382 podStartE2EDuration="2.592138382s" podCreationTimestamp="2026-02-18 12:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:05:34.586357801 +0000 UTC m=+842.015258682" watchObservedRunningTime="2026-02-18 12:05:34.592138382 +0000 UTC m=+842.021039243"
Feb 18 12:05:41 crc kubenswrapper[4880]: I0218 12:05:41.622863 4880 generic.go:334] "Generic (PLEG): container finished" podID="254a0886-f36a-4066-a0e4-3b35f835d81c" containerID="d52ab88c48a8314250ead9de40975fc979e654bfd142c497706502f35e5ae65c" exitCode=0
Feb 18 12:05:41 crc kubenswrapper[4880]: I0218 12:05:41.622949 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fhz9j" 
event={"ID":"254a0886-f36a-4066-a0e4-3b35f835d81c","Type":"ContainerDied","Data":"d52ab88c48a8314250ead9de40975fc979e654bfd142c497706502f35e5ae65c"}
Feb 18 12:05:41 crc kubenswrapper[4880]: I0218 12:05:41.624944 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh" event={"ID":"08be2c36-3a51-4cee-a9aa-f2140d689e95","Type":"ContainerStarted","Data":"2d6004c4028a8784809d6c8764028b743c507cdcc5048dd16b4a19bb05ba0b84"}
Feb 18 12:05:41 crc kubenswrapper[4880]: I0218 12:05:41.625152 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh"
Feb 18 12:05:41 crc kubenswrapper[4880]: I0218 12:05:41.664394 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh" podStartSLOduration=2.799159379 podStartE2EDuration="9.664377228s" podCreationTimestamp="2026-02-18 12:05:32 +0000 UTC" firstStartedPulling="2026-02-18 12:05:33.66198302 +0000 UTC m=+841.090883901" lastFinishedPulling="2026-02-18 12:05:40.527200889 +0000 UTC m=+847.956101750" observedRunningTime="2026-02-18 12:05:41.662058747 +0000 UTC m=+849.090959618" watchObservedRunningTime="2026-02-18 12:05:41.664377228 +0000 UTC m=+849.093278089"
Feb 18 12:05:42 crc kubenswrapper[4880]: I0218 12:05:42.635093 4880 generic.go:334] "Generic (PLEG): container finished" podID="254a0886-f36a-4066-a0e4-3b35f835d81c" containerID="8d44c3b9a8a54695105ac9529ea5e5a9a69b64eeafbc4da616f358e6808043a6" exitCode=0
Feb 18 12:05:42 crc kubenswrapper[4880]: I0218 12:05:42.635195 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fhz9j" event={"ID":"254a0886-f36a-4066-a0e4-3b35f835d81c","Type":"ContainerDied","Data":"8d44c3b9a8a54695105ac9529ea5e5a9a69b64eeafbc4da616f358e6808043a6"}
Feb 18 12:05:43 crc kubenswrapper[4880]: I0218 12:05:43.238419 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/controller-69bbfbf88f-5h66c"
Feb 18 12:05:43 crc kubenswrapper[4880]: I0218 12:05:43.643122 4880 generic.go:334] "Generic (PLEG): container finished" podID="254a0886-f36a-4066-a0e4-3b35f835d81c" containerID="96af24a171b5ebff8c3a4e056a1559d886b55ae41ebb11cceaf82c0fd80c78a1" exitCode=0
Feb 18 12:05:43 crc kubenswrapper[4880]: I0218 12:05:43.643178 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fhz9j" event={"ID":"254a0886-f36a-4066-a0e4-3b35f835d81c","Type":"ContainerDied","Data":"96af24a171b5ebff8c3a4e056a1559d886b55ae41ebb11cceaf82c0fd80c78a1"}
Feb 18 12:05:44 crc kubenswrapper[4880]: I0218 12:05:44.654916 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fhz9j" event={"ID":"254a0886-f36a-4066-a0e4-3b35f835d81c","Type":"ContainerStarted","Data":"87a589979cf10d370d60770b8ec921c543fdd0383f6011910497d96d34889a30"}
Feb 18 12:05:44 crc kubenswrapper[4880]: I0218 12:05:44.655340 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fhz9j" event={"ID":"254a0886-f36a-4066-a0e4-3b35f835d81c","Type":"ContainerStarted","Data":"666e8183dae399a8071a4eac7dc977d56e838bb979b2321e0259742834560bc2"}
Feb 18 12:05:44 crc kubenswrapper[4880]: I0218 12:05:44.655358 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fhz9j" event={"ID":"254a0886-f36a-4066-a0e4-3b35f835d81c","Type":"ContainerStarted","Data":"89a53310b5cc132d24a4f01b23cce441694e2029a6d9f51e60d1acb4b65e4c6e"}
Feb 18 12:05:44 crc kubenswrapper[4880]: I0218 12:05:44.655408 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fhz9j" event={"ID":"254a0886-f36a-4066-a0e4-3b35f835d81c","Type":"ContainerStarted","Data":"c309bd62b8d893beda638e5e20103496a278edf294bd3e25dce54d374bc77644"}
Feb 18 12:05:44 crc kubenswrapper[4880]: I0218 12:05:44.655424 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fhz9j" 
event={"ID":"254a0886-f36a-4066-a0e4-3b35f835d81c","Type":"ContainerStarted","Data":"fb6ad13557bde963c611f270ccebd88e1f619a6df54292676f69fc15f749706d"}
Feb 18 12:05:44 crc kubenswrapper[4880]: I0218 12:05:44.655434 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fhz9j" event={"ID":"254a0886-f36a-4066-a0e4-3b35f835d81c","Type":"ContainerStarted","Data":"1cb0674f961754de943758f98992bcca434c3fe0ca59fbfef7fcb3818119bc35"}
Feb 18 12:05:44 crc kubenswrapper[4880]: I0218 12:05:44.682765 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-fhz9j" podStartSLOduration=5.550365204 podStartE2EDuration="12.682743496s" podCreationTimestamp="2026-02-18 12:05:32 +0000 UTC" firstStartedPulling="2026-02-18 12:05:33.377376897 +0000 UTC m=+840.806277768" lastFinishedPulling="2026-02-18 12:05:40.509755199 +0000 UTC m=+847.938656060" observedRunningTime="2026-02-18 12:05:44.680505737 +0000 UTC m=+852.109406598" watchObservedRunningTime="2026-02-18 12:05:44.682743496 +0000 UTC m=+852.111644357"
Feb 18 12:05:45 crc kubenswrapper[4880]: I0218 12:05:45.661792 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-fhz9j"
Feb 18 12:05:48 crc kubenswrapper[4880]: I0218 12:05:48.143758 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-fhz9j"
Feb 18 12:05:48 crc kubenswrapper[4880]: I0218 12:05:48.187496 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-fhz9j"
Feb 18 12:05:53 crc kubenswrapper[4880]: I0218 12:05:53.146829 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-fhz9j"
Feb 18 12:05:53 crc kubenswrapper[4880]: I0218 12:05:53.150992 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5fmhh"
Feb 18 12:05:53 crc kubenswrapper[4880]: I0218 
12:05:53.836338 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-n64cr"
Feb 18 12:05:56 crc kubenswrapper[4880]: I0218 12:05:56.508411 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-smvck"]
Feb 18 12:05:56 crc kubenswrapper[4880]: I0218 12:05:56.509712 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-smvck"
Feb 18 12:05:56 crc kubenswrapper[4880]: I0218 12:05:56.514336 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-vsx8x"
Feb 18 12:05:56 crc kubenswrapper[4880]: I0218 12:05:56.514541 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Feb 18 12:05:56 crc kubenswrapper[4880]: I0218 12:05:56.515407 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Feb 18 12:05:56 crc kubenswrapper[4880]: I0218 12:05:56.567382 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-smvck"]
Feb 18 12:05:56 crc kubenswrapper[4880]: I0218 12:05:56.669866 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwbdl\" (UniqueName: \"kubernetes.io/projected/ce8443b2-0fd6-4382-8226-b5d3eb47ec4b-kube-api-access-jwbdl\") pod \"openstack-operator-index-smvck\" (UID: \"ce8443b2-0fd6-4382-8226-b5d3eb47ec4b\") " pod="openstack-operators/openstack-operator-index-smvck"
Feb 18 12:05:56 crc kubenswrapper[4880]: I0218 12:05:56.771497 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwbdl\" (UniqueName: \"kubernetes.io/projected/ce8443b2-0fd6-4382-8226-b5d3eb47ec4b-kube-api-access-jwbdl\") pod \"openstack-operator-index-smvck\" (UID: 
\"ce8443b2-0fd6-4382-8226-b5d3eb47ec4b\") " pod="openstack-operators/openstack-operator-index-smvck"
Feb 18 12:05:56 crc kubenswrapper[4880]: I0218 12:05:56.793575 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwbdl\" (UniqueName: \"kubernetes.io/projected/ce8443b2-0fd6-4382-8226-b5d3eb47ec4b-kube-api-access-jwbdl\") pod \"openstack-operator-index-smvck\" (UID: \"ce8443b2-0fd6-4382-8226-b5d3eb47ec4b\") " pod="openstack-operators/openstack-operator-index-smvck"
Feb 18 12:05:56 crc kubenswrapper[4880]: I0218 12:05:56.895576 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-smvck"
Feb 18 12:05:57 crc kubenswrapper[4880]: I0218 12:05:57.326515 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-smvck"]
Feb 18 12:05:57 crc kubenswrapper[4880]: W0218 12:05:57.329108 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce8443b2_0fd6_4382_8226_b5d3eb47ec4b.slice/crio-d4457f9f64e936e9fac880141d9ecc8015d4c867ceec3b34e5d796cab2180bc2 WatchSource:0}: Error finding container d4457f9f64e936e9fac880141d9ecc8015d4c867ceec3b34e5d796cab2180bc2: Status 404 returned error can't find the container with id d4457f9f64e936e9fac880141d9ecc8015d4c867ceec3b34e5d796cab2180bc2
Feb 18 12:05:57 crc kubenswrapper[4880]: I0218 12:05:57.740793 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-smvck" event={"ID":"ce8443b2-0fd6-4382-8226-b5d3eb47ec4b","Type":"ContainerStarted","Data":"d4457f9f64e936e9fac880141d9ecc8015d4c867ceec3b34e5d796cab2180bc2"}
Feb 18 12:05:59 crc kubenswrapper[4880]: I0218 12:05:59.753070 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-smvck" 
event={"ID":"ce8443b2-0fd6-4382-8226-b5d3eb47ec4b","Type":"ContainerStarted","Data":"5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda"}
Feb 18 12:05:59 crc kubenswrapper[4880]: I0218 12:05:59.774468 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-smvck" podStartSLOduration=1.700114387 podStartE2EDuration="3.774447445s" podCreationTimestamp="2026-02-18 12:05:56 +0000 UTC" firstStartedPulling="2026-02-18 12:05:57.332053043 +0000 UTC m=+864.760953904" lastFinishedPulling="2026-02-18 12:05:59.406386101 +0000 UTC m=+866.835286962" observedRunningTime="2026-02-18 12:05:59.768281014 +0000 UTC m=+867.197181885" watchObservedRunningTime="2026-02-18 12:05:59.774447445 +0000 UTC m=+867.203348306"
Feb 18 12:05:59 crc kubenswrapper[4880]: I0218 12:05:59.865543 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-smvck"]
Feb 18 12:06:00 crc kubenswrapper[4880]: I0218 12:06:00.471091 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-x2nkq"]
Feb 18 12:06:00 crc kubenswrapper[4880]: I0218 12:06:00.472254 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-x2nkq"
Feb 18 12:06:00 crc kubenswrapper[4880]: I0218 12:06:00.482542 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x2nkq"]
Feb 18 12:06:00 crc kubenswrapper[4880]: I0218 12:06:00.537130 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llzbx\" (UniqueName: \"kubernetes.io/projected/1a2e5b16-76c1-4ade-916d-0b4fd45b3acf-kube-api-access-llzbx\") pod \"openstack-operator-index-x2nkq\" (UID: \"1a2e5b16-76c1-4ade-916d-0b4fd45b3acf\") " pod="openstack-operators/openstack-operator-index-x2nkq"
Feb 18 12:06:00 crc kubenswrapper[4880]: I0218 12:06:00.638220 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llzbx\" (UniqueName: \"kubernetes.io/projected/1a2e5b16-76c1-4ade-916d-0b4fd45b3acf-kube-api-access-llzbx\") pod \"openstack-operator-index-x2nkq\" (UID: \"1a2e5b16-76c1-4ade-916d-0b4fd45b3acf\") " pod="openstack-operators/openstack-operator-index-x2nkq"
Feb 18 12:06:00 crc kubenswrapper[4880]: I0218 12:06:00.660425 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llzbx\" (UniqueName: \"kubernetes.io/projected/1a2e5b16-76c1-4ade-916d-0b4fd45b3acf-kube-api-access-llzbx\") pod \"openstack-operator-index-x2nkq\" (UID: \"1a2e5b16-76c1-4ade-916d-0b4fd45b3acf\") " pod="openstack-operators/openstack-operator-index-x2nkq"
Feb 18 12:06:00 crc kubenswrapper[4880]: I0218 12:06:00.794976 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-x2nkq"
Feb 18 12:06:01 crc kubenswrapper[4880]: I0218 12:06:01.215298 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x2nkq"]
Feb 18 12:06:01 crc kubenswrapper[4880]: W0218 12:06:01.220281 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a2e5b16_76c1_4ade_916d_0b4fd45b3acf.slice/crio-fdcf45e762a761ebc2f52d623e8c86e00810272489a6949e71dfc13c3f3af75c WatchSource:0}: Error finding container fdcf45e762a761ebc2f52d623e8c86e00810272489a6949e71dfc13c3f3af75c: Status 404 returned error can't find the container with id fdcf45e762a761ebc2f52d623e8c86e00810272489a6949e71dfc13c3f3af75c
Feb 18 12:06:01 crc kubenswrapper[4880]: I0218 12:06:01.772165 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x2nkq" event={"ID":"1a2e5b16-76c1-4ade-916d-0b4fd45b3acf","Type":"ContainerStarted","Data":"f0ce7d3226bce435c13d52481700108a976b2502d35636b71779dfde3963aafb"}
Feb 18 12:06:01 crc kubenswrapper[4880]: I0218 12:06:01.773035 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x2nkq" event={"ID":"1a2e5b16-76c1-4ade-916d-0b4fd45b3acf","Type":"ContainerStarted","Data":"fdcf45e762a761ebc2f52d623e8c86e00810272489a6949e71dfc13c3f3af75c"}
Feb 18 12:06:01 crc kubenswrapper[4880]: I0218 12:06:01.772322 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-smvck" podUID="ce8443b2-0fd6-4382-8226-b5d3eb47ec4b" containerName="registry-server" containerID="cri-o://5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda" gracePeriod=2
Feb 18 12:06:01 crc kubenswrapper[4880]: I0218 12:06:01.798705 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-x2nkq" podStartSLOduration=1.750497158 podStartE2EDuration="1.798676396s" podCreationTimestamp="2026-02-18 12:06:00 +0000 UTC" firstStartedPulling="2026-02-18 12:06:01.225402982 +0000 UTC m=+868.654303843" lastFinishedPulling="2026-02-18 12:06:01.27358222 +0000 UTC m=+868.702483081" observedRunningTime="2026-02-18 12:06:01.796747216 +0000 UTC m=+869.225648117" watchObservedRunningTime="2026-02-18 12:06:01.798676396 +0000 UTC m=+869.227577287"
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.187370 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-smvck"
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.374283 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwbdl\" (UniqueName: \"kubernetes.io/projected/ce8443b2-0fd6-4382-8226-b5d3eb47ec4b-kube-api-access-jwbdl\") pod \"ce8443b2-0fd6-4382-8226-b5d3eb47ec4b\" (UID: \"ce8443b2-0fd6-4382-8226-b5d3eb47ec4b\") "
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.458884 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8443b2-0fd6-4382-8226-b5d3eb47ec4b-kube-api-access-jwbdl" (OuterVolumeSpecName: "kube-api-access-jwbdl") pod "ce8443b2-0fd6-4382-8226-b5d3eb47ec4b" (UID: "ce8443b2-0fd6-4382-8226-b5d3eb47ec4b"). InnerVolumeSpecName "kube-api-access-jwbdl". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.476852 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwbdl\" (UniqueName: \"kubernetes.io/projected/ce8443b2-0fd6-4382-8226-b5d3eb47ec4b-kube-api-access-jwbdl\") on node \"crc\" DevicePath \"\""
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.781801 4880 generic.go:334] "Generic (PLEG): container finished" podID="ce8443b2-0fd6-4382-8226-b5d3eb47ec4b" containerID="5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda" exitCode=0
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.781892 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-smvck"
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.781955 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-smvck" event={"ID":"ce8443b2-0fd6-4382-8226-b5d3eb47ec4b","Type":"ContainerDied","Data":"5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda"}
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.782033 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-smvck" event={"ID":"ce8443b2-0fd6-4382-8226-b5d3eb47ec4b","Type":"ContainerDied","Data":"d4457f9f64e936e9fac880141d9ecc8015d4c867ceec3b34e5d796cab2180bc2"}
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.782058 4880 scope.go:117] "RemoveContainer" containerID="5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda"
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.812510 4880 scope.go:117] "RemoveContainer" containerID="5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda"
Feb 18 12:06:02 crc kubenswrapper[4880]: E0218 12:06:02.813219 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda\": container with ID starting with 5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda not found: ID does not exist" containerID="5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda"
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.813272 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda"} err="failed to get container status \"5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda\": rpc error: code = NotFound desc = could not find container \"5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda\": container with ID starting with 5452dbf5136b47d3156a890f2205ed2ea8af0d6027d0deb334000a822ca4cfda not found: ID does not exist"
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.815280 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-smvck"]
Feb 18 12:06:02 crc kubenswrapper[4880]: I0218 12:06:02.820466 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-smvck"]
Feb 18 12:06:03 crc kubenswrapper[4880]: I0218 12:06:03.189504 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8443b2-0fd6-4382-8226-b5d3eb47ec4b" path="/var/lib/kubelet/pods/ce8443b2-0fd6-4382-8226-b5d3eb47ec4b/volumes"
Feb 18 12:06:10 crc kubenswrapper[4880]: I0218 12:06:10.795713 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-x2nkq"
Feb 18 12:06:10 crc kubenswrapper[4880]: I0218 12:06:10.796781 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-x2nkq"
Feb 18 12:06:10 crc kubenswrapper[4880]: I0218 12:06:10.838984 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-x2nkq" Feb 18 12:06:10 crc kubenswrapper[4880]: I0218 12:06:10.876914 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-x2nkq" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.126472 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp"] Feb 18 12:06:12 crc kubenswrapper[4880]: E0218 12:06:12.127585 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8443b2-0fd6-4382-8226-b5d3eb47ec4b" containerName="registry-server" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.127717 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8443b2-0fd6-4382-8226-b5d3eb47ec4b" containerName="registry-server" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.127942 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8443b2-0fd6-4382-8226-b5d3eb47ec4b" containerName="registry-server" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.129278 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.131505 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6r82j" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.149670 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp"] Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.150616 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-bundle\") pod \"0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.150758 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6fr5\" (UniqueName: \"kubernetes.io/projected/390a7018-a85a-45ce-85e5-215b6b67a5f3-kube-api-access-f6fr5\") pod \"0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.150870 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-util\") pod \"0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 
12:06:12.252107 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-bundle\") pod \"0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.252162 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fr5\" (UniqueName: \"kubernetes.io/projected/390a7018-a85a-45ce-85e5-215b6b67a5f3-kube-api-access-f6fr5\") pod \"0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.252184 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-util\") pod \"0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.253225 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-util\") pod \"0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.253305 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-bundle\") pod \"0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.274770 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fr5\" (UniqueName: \"kubernetes.io/projected/390a7018-a85a-45ce-85e5-215b6b67a5f3-kube-api-access-f6fr5\") pod \"0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.450131 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:12 crc kubenswrapper[4880]: I0218 12:06:12.878341 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp"] Feb 18 12:06:13 crc kubenswrapper[4880]: I0218 12:06:13.881946 4880 generic.go:334] "Generic (PLEG): container finished" podID="390a7018-a85a-45ce-85e5-215b6b67a5f3" containerID="5c0c2c585107cd35e385414f51f207a39160916d36b64d53502975a2fb3833d2" exitCode=0 Feb 18 12:06:13 crc kubenswrapper[4880]: I0218 12:06:13.882294 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" event={"ID":"390a7018-a85a-45ce-85e5-215b6b67a5f3","Type":"ContainerDied","Data":"5c0c2c585107cd35e385414f51f207a39160916d36b64d53502975a2fb3833d2"} Feb 18 12:06:13 crc kubenswrapper[4880]: I0218 12:06:13.882322 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" event={"ID":"390a7018-a85a-45ce-85e5-215b6b67a5f3","Type":"ContainerStarted","Data":"37a30bd7b87269296518950958c1844aa6514be4b3bd64d8284b87bfcb7f525f"} Feb 18 12:06:14 crc kubenswrapper[4880]: I0218 12:06:14.893050 4880 generic.go:334] "Generic (PLEG): container finished" podID="390a7018-a85a-45ce-85e5-215b6b67a5f3" containerID="eea9049b8591795b111d20be3161b38e83ccd59166181adaf930f47bba066342" exitCode=0 Feb 18 12:06:14 crc kubenswrapper[4880]: I0218 12:06:14.893164 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" event={"ID":"390a7018-a85a-45ce-85e5-215b6b67a5f3","Type":"ContainerDied","Data":"eea9049b8591795b111d20be3161b38e83ccd59166181adaf930f47bba066342"} Feb 18 12:06:15 crc kubenswrapper[4880]: I0218 12:06:15.902250 4880 generic.go:334] "Generic (PLEG): container finished" podID="390a7018-a85a-45ce-85e5-215b6b67a5f3" containerID="bfe8768f9fc5a6d85175535dbda7af9773b9773c57b7a0a07d4be1e3709e7187" exitCode=0 Feb 18 12:06:15 crc kubenswrapper[4880]: I0218 12:06:15.902347 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" event={"ID":"390a7018-a85a-45ce-85e5-215b6b67a5f3","Type":"ContainerDied","Data":"bfe8768f9fc5a6d85175535dbda7af9773b9773c57b7a0a07d4be1e3709e7187"} Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.169509 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.321898 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-bundle\") pod \"390a7018-a85a-45ce-85e5-215b6b67a5f3\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.321998 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-util\") pod \"390a7018-a85a-45ce-85e5-215b6b67a5f3\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.322024 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6fr5\" (UniqueName: \"kubernetes.io/projected/390a7018-a85a-45ce-85e5-215b6b67a5f3-kube-api-access-f6fr5\") pod \"390a7018-a85a-45ce-85e5-215b6b67a5f3\" (UID: \"390a7018-a85a-45ce-85e5-215b6b67a5f3\") " Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.322953 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-bundle" (OuterVolumeSpecName: "bundle") pod "390a7018-a85a-45ce-85e5-215b6b67a5f3" (UID: "390a7018-a85a-45ce-85e5-215b6b67a5f3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.327460 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390a7018-a85a-45ce-85e5-215b6b67a5f3-kube-api-access-f6fr5" (OuterVolumeSpecName: "kube-api-access-f6fr5") pod "390a7018-a85a-45ce-85e5-215b6b67a5f3" (UID: "390a7018-a85a-45ce-85e5-215b6b67a5f3"). InnerVolumeSpecName "kube-api-access-f6fr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.336969 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-util" (OuterVolumeSpecName: "util") pod "390a7018-a85a-45ce-85e5-215b6b67a5f3" (UID: "390a7018-a85a-45ce-85e5-215b6b67a5f3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.423728 4880 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-util\") on node \"crc\" DevicePath \"\"" Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.423793 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6fr5\" (UniqueName: \"kubernetes.io/projected/390a7018-a85a-45ce-85e5-215b6b67a5f3-kube-api-access-f6fr5\") on node \"crc\" DevicePath \"\"" Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.423805 4880 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/390a7018-a85a-45ce-85e5-215b6b67a5f3-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.918012 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" event={"ID":"390a7018-a85a-45ce-85e5-215b6b67a5f3","Type":"ContainerDied","Data":"37a30bd7b87269296518950958c1844aa6514be4b3bd64d8284b87bfcb7f525f"} Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.918061 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37a30bd7b87269296518950958c1844aa6514be4b3bd64d8284b87bfcb7f525f" Feb 18 12:06:17 crc kubenswrapper[4880]: I0218 12:06:17.918070 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.168349 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl"] Feb 18 12:06:24 crc kubenswrapper[4880]: E0218 12:06:24.169317 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390a7018-a85a-45ce-85e5-215b6b67a5f3" containerName="util" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.169333 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="390a7018-a85a-45ce-85e5-215b6b67a5f3" containerName="util" Feb 18 12:06:24 crc kubenswrapper[4880]: E0218 12:06:24.169350 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390a7018-a85a-45ce-85e5-215b6b67a5f3" containerName="pull" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.169358 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="390a7018-a85a-45ce-85e5-215b6b67a5f3" containerName="pull" Feb 18 12:06:24 crc kubenswrapper[4880]: E0218 12:06:24.169369 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390a7018-a85a-45ce-85e5-215b6b67a5f3" containerName="extract" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.169377 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="390a7018-a85a-45ce-85e5-215b6b67a5f3" containerName="extract" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.169522 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="390a7018-a85a-45ce-85e5-215b6b67a5f3" containerName="extract" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.170019 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.175236 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-nmpxg" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.191511 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl"] Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.321544 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxlc\" (UniqueName: \"kubernetes.io/projected/1b74510a-df13-4112-a258-4d1d08b5258e-kube-api-access-6xxlc\") pod \"openstack-operator-controller-init-795b869f54-7wrfl\" (UID: \"1b74510a-df13-4112-a258-4d1d08b5258e\") " pod="openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.422418 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxlc\" (UniqueName: \"kubernetes.io/projected/1b74510a-df13-4112-a258-4d1d08b5258e-kube-api-access-6xxlc\") pod \"openstack-operator-controller-init-795b869f54-7wrfl\" (UID: \"1b74510a-df13-4112-a258-4d1d08b5258e\") " pod="openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.448385 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxlc\" (UniqueName: \"kubernetes.io/projected/1b74510a-df13-4112-a258-4d1d08b5258e-kube-api-access-6xxlc\") pod \"openstack-operator-controller-init-795b869f54-7wrfl\" (UID: \"1b74510a-df13-4112-a258-4d1d08b5258e\") " pod="openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.488168 4880 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl" Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.751051 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl"] Feb 18 12:06:24 crc kubenswrapper[4880]: W0218 12:06:24.756631 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b74510a_df13_4112_a258_4d1d08b5258e.slice/crio-4fff73d0858d41e4ac2d6eeebdce16e6f63477c83516a93e5c2e219a1029e913 WatchSource:0}: Error finding container 4fff73d0858d41e4ac2d6eeebdce16e6f63477c83516a93e5c2e219a1029e913: Status 404 returned error can't find the container with id 4fff73d0858d41e4ac2d6eeebdce16e6f63477c83516a93e5c2e219a1029e913 Feb 18 12:06:24 crc kubenswrapper[4880]: I0218 12:06:24.966504 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl" event={"ID":"1b74510a-df13-4112-a258-4d1d08b5258e","Type":"ContainerStarted","Data":"4fff73d0858d41e4ac2d6eeebdce16e6f63477c83516a93e5c2e219a1029e913"} Feb 18 12:06:29 crc kubenswrapper[4880]: I0218 12:06:28.999519 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl" event={"ID":"1b74510a-df13-4112-a258-4d1d08b5258e","Type":"ContainerStarted","Data":"ccd4f2a96fe327f2b1c6be4f894b10b2f57dbe426bfbe3d96a78f95b8b4784c7"} Feb 18 12:06:29 crc kubenswrapper[4880]: I0218 12:06:29.000362 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl" Feb 18 12:06:29 crc kubenswrapper[4880]: I0218 12:06:29.059528 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl" podStartSLOduration=1.372688725 
podStartE2EDuration="5.05950671s" podCreationTimestamp="2026-02-18 12:06:24 +0000 UTC" firstStartedPulling="2026-02-18 12:06:24.759218414 +0000 UTC m=+892.188119275" lastFinishedPulling="2026-02-18 12:06:28.446036399 +0000 UTC m=+895.874937260" observedRunningTime="2026-02-18 12:06:29.056033058 +0000 UTC m=+896.484933929" watchObservedRunningTime="2026-02-18 12:06:29.05950671 +0000 UTC m=+896.488407571" Feb 18 12:06:34 crc kubenswrapper[4880]: I0218 12:06:34.494589 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-795b869f54-7wrfl" Feb 18 12:06:53 crc kubenswrapper[4880]: I0218 12:06:53.274823 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:06:53 crc kubenswrapper[4880]: I0218 12:06:53.275795 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:06:59 crc kubenswrapper[4880]: I0218 12:06:59.975158 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9"] Feb 18 12:06:59 crc kubenswrapper[4880]: I0218 12:06:59.978890 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" Feb 18 12:06:59 crc kubenswrapper[4880]: I0218 12:06:59.982342 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-tn54l" Feb 18 12:06:59 crc kubenswrapper[4880]: I0218 12:06:59.987408 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg"] Feb 18 12:06:59 crc kubenswrapper[4880]: I0218 12:06:59.988560 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg" Feb 18 12:06:59 crc kubenswrapper[4880]: I0218 12:06:59.990732 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-pgd4v" Feb 18 12:06:59 crc kubenswrapper[4880]: I0218 12:06:59.992087 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.015979 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.020072 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.029073 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-ncwtc" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.029424 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.043088 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.094485 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.095460 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.099566 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rb5hd" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.107474 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.108431 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.110029 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lpvv6" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.130898 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.150784 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.152807 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k96hc\" (UniqueName: \"kubernetes.io/projected/fd00b294-1b81-4988-bf25-d5681534e3f1-kube-api-access-k96hc\") pod \"designate-operator-controller-manager-6d8bf5c495-vjrxb\" (UID: \"fd00b294-1b81-4988-bf25-d5681534e3f1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.152905 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9h2m\" (UniqueName: \"kubernetes.io/projected/1d6b527e-fad8-4853-a0ab-d9fa69c934ff-kube-api-access-d9h2m\") pod \"cinder-operator-controller-manager-5d946d989d-bmrlg\" (UID: \"1d6b527e-fad8-4853-a0ab-d9fa69c934ff\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.152957 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpzt9\" (UniqueName: \"kubernetes.io/projected/cb351566-0fe8-4583-86c9-cd123efb0db6-kube-api-access-tpzt9\") pod 
\"barbican-operator-controller-manager-868647ff47-t4mk9\" (UID: \"cb351566-0fe8-4583-86c9-cd123efb0db6\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.184832 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.185919 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.195208 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rrbqp" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.198189 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.211109 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-k2psm"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.212007 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.213373 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.217329 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.218459 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.220005 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-h9plc" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.221890 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-k2psm"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.237525 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-ps8cj" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.238689 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.247867 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.248939 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.258904 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-j5fq7" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.260642 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k96hc\" (UniqueName: \"kubernetes.io/projected/fd00b294-1b81-4988-bf25-d5681534e3f1-kube-api-access-k96hc\") pod \"designate-operator-controller-manager-6d8bf5c495-vjrxb\" (UID: \"fd00b294-1b81-4988-bf25-d5681534e3f1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.260726 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn6wt\" (UniqueName: \"kubernetes.io/projected/479cc862-303e-4c55-9818-4c4cd04a5987-kube-api-access-mn6wt\") pod \"heat-operator-controller-manager-69f49c598c-4pvlt\" (UID: \"479cc862-303e-4c55-9818-4c4cd04a5987\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.260840 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9h2m\" (UniqueName: \"kubernetes.io/projected/1d6b527e-fad8-4853-a0ab-d9fa69c934ff-kube-api-access-d9h2m\") pod \"cinder-operator-controller-manager-5d946d989d-bmrlg\" (UID: \"1d6b527e-fad8-4853-a0ab-d9fa69c934ff\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.260895 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpzt9\" (UniqueName: \"kubernetes.io/projected/cb351566-0fe8-4583-86c9-cd123efb0db6-kube-api-access-tpzt9\") pod 
\"barbican-operator-controller-manager-868647ff47-t4mk9\" (UID: \"cb351566-0fe8-4583-86c9-cd123efb0db6\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.260997 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bbrh\" (UniqueName: \"kubernetes.io/projected/7975dd71-7d05-400c-9058-c76c811d0bb0-kube-api-access-7bbrh\") pod \"glance-operator-controller-manager-77987464f4-mcs6b\" (UID: \"7975dd71-7d05-400c-9058-c76c811d0bb0\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.283689 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.284790 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.296485 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-qsv5h" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.300008 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.308330 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9h2m\" (UniqueName: \"kubernetes.io/projected/1d6b527e-fad8-4853-a0ab-d9fa69c934ff-kube-api-access-d9h2m\") pod \"cinder-operator-controller-manager-5d946d989d-bmrlg\" (UID: \"1d6b527e-fad8-4853-a0ab-d9fa69c934ff\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.312505 4880 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpzt9\" (UniqueName: \"kubernetes.io/projected/cb351566-0fe8-4583-86c9-cd123efb0db6-kube-api-access-tpzt9\") pod \"barbican-operator-controller-manager-868647ff47-t4mk9\" (UID: \"cb351566-0fe8-4583-86c9-cd123efb0db6\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.330906 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k96hc\" (UniqueName: \"kubernetes.io/projected/fd00b294-1b81-4988-bf25-d5681534e3f1-kube-api-access-k96hc\") pod \"designate-operator-controller-manager-6d8bf5c495-vjrxb\" (UID: \"fd00b294-1b81-4988-bf25-d5681534e3f1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.331002 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.331324 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.340204 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.341131 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.344868 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-g65wt" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.358251 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.366954 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.367005 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98snz\" (UniqueName: \"kubernetes.io/projected/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-kube-api-access-98snz\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.367039 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbrh\" (UniqueName: \"kubernetes.io/projected/7975dd71-7d05-400c-9058-c76c811d0bb0-kube-api-access-7bbrh\") pod \"glance-operator-controller-manager-77987464f4-mcs6b\" (UID: \"7975dd71-7d05-400c-9058-c76c811d0bb0\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.367057 4880 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxnxq\" (UniqueName: \"kubernetes.io/projected/5d4d2d05-77bc-4e37-8450-b2281aaa468a-kube-api-access-xxnxq\") pod \"keystone-operator-controller-manager-b4d948c87-6wl45\" (UID: \"5d4d2d05-77bc-4e37-8450-b2281aaa468a\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.367095 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn6wt\" (UniqueName: \"kubernetes.io/projected/479cc862-303e-4c55-9818-4c4cd04a5987-kube-api-access-mn6wt\") pod \"heat-operator-controller-manager-69f49c598c-4pvlt\" (UID: \"479cc862-303e-4c55-9818-4c4cd04a5987\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.367129 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6cd\" (UniqueName: \"kubernetes.io/projected/bf26df20-6bff-4038-bfcb-c87b39c061ed-kube-api-access-sd6cd\") pod \"manila-operator-controller-manager-54f6768c69-kv9jb\" (UID: \"bf26df20-6bff-4038-bfcb-c87b39c061ed\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.367419 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2kjg\" (UniqueName: \"kubernetes.io/projected/df1f8ad9-7e58-4e54-b189-d994806814bf-kube-api-access-k2kjg\") pod \"horizon-operator-controller-manager-5b9b8895d5-vkgzq\" (UID: \"df1f8ad9-7e58-4e54-b189-d994806814bf\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.367455 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nn7h\" 
(UniqueName: \"kubernetes.io/projected/bfb70827-0634-4721-8cac-3d1a0e4153a6-kube-api-access-8nn7h\") pod \"ironic-operator-controller-manager-554564d7fc-kwz6h\" (UID: \"bfb70827-0634-4721-8cac-3d1a0e4153a6\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.382852 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.413342 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.414297 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.419365 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8ftcc" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.425175 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.426072 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.428900 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-v4kjw" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.452726 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn6wt\" (UniqueName: \"kubernetes.io/projected/479cc862-303e-4c55-9818-4c4cd04a5987-kube-api-access-mn6wt\") pod \"heat-operator-controller-manager-69f49c598c-4pvlt\" (UID: \"479cc862-303e-4c55-9818-4c4cd04a5987\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.453025 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.453085 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bbrh\" (UniqueName: \"kubernetes.io/projected/7975dd71-7d05-400c-9058-c76c811d0bb0-kube-api-access-7bbrh\") pod \"glance-operator-controller-manager-77987464f4-mcs6b\" (UID: \"7975dd71-7d05-400c-9058-c76c811d0bb0\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.456484 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.464087 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.465339 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.468786 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-h87f7" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.469816 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd6cd\" (UniqueName: \"kubernetes.io/projected/bf26df20-6bff-4038-bfcb-c87b39c061ed-kube-api-access-sd6cd\") pod \"manila-operator-controller-manager-54f6768c69-kv9jb\" (UID: \"bf26df20-6bff-4038-bfcb-c87b39c061ed\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.469865 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2kjg\" (UniqueName: \"kubernetes.io/projected/df1f8ad9-7e58-4e54-b189-d994806814bf-kube-api-access-k2kjg\") pod \"horizon-operator-controller-manager-5b9b8895d5-vkgzq\" (UID: \"df1f8ad9-7e58-4e54-b189-d994806814bf\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.469899 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nn7h\" (UniqueName: \"kubernetes.io/projected/bfb70827-0634-4721-8cac-3d1a0e4153a6-kube-api-access-8nn7h\") pod \"ironic-operator-controller-manager-554564d7fc-kwz6h\" (UID: \"bfb70827-0634-4721-8cac-3d1a0e4153a6\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.469925 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpg5x\" (UniqueName: \"kubernetes.io/projected/58de4ed7-c54f-4970-99ab-dae7521db195-kube-api-access-tpg5x\") pod 
\"mariadb-operator-controller-manager-6994f66f48-2r47l\" (UID: \"58de4ed7-c54f-4970-99ab-dae7521db195\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.469952 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.471484 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98snz\" (UniqueName: \"kubernetes.io/projected/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-kube-api-access-98snz\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.471570 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxnxq\" (UniqueName: \"kubernetes.io/projected/5d4d2d05-77bc-4e37-8450-b2281aaa468a-kube-api-access-xxnxq\") pod \"keystone-operator-controller-manager-b4d948c87-6wl45\" (UID: \"5d4d2d05-77bc-4e37-8450-b2281aaa468a\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45" Feb 18 12:07:00 crc kubenswrapper[4880]: E0218 12:07:00.472678 4880 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 12:07:00 crc kubenswrapper[4880]: E0218 12:07:00.472747 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert podName:eaad9a94-37d2-4df9-ad82-1728dde9a0c4 nodeName:}" failed. 
No retries permitted until 2026-02-18 12:07:00.972721066 +0000 UTC m=+928.401621927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert") pod "infra-operator-controller-manager-79d975b745-k2psm" (UID: "eaad9a94-37d2-4df9-ad82-1728dde9a0c4") : secret "infra-operator-webhook-server-cert" not found Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.500782 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.506398 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd6cd\" (UniqueName: \"kubernetes.io/projected/bf26df20-6bff-4038-bfcb-c87b39c061ed-kube-api-access-sd6cd\") pod \"manila-operator-controller-manager-54f6768c69-kv9jb\" (UID: \"bf26df20-6bff-4038-bfcb-c87b39c061ed\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.511927 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nn7h\" (UniqueName: \"kubernetes.io/projected/bfb70827-0634-4721-8cac-3d1a0e4153a6-kube-api-access-8nn7h\") pod \"ironic-operator-controller-manager-554564d7fc-kwz6h\" (UID: \"bfb70827-0634-4721-8cac-3d1a0e4153a6\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.513216 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98snz\" (UniqueName: \"kubernetes.io/projected/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-kube-api-access-98snz\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 
12:07:00.517479 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2kjg\" (UniqueName: \"kubernetes.io/projected/df1f8ad9-7e58-4e54-b189-d994806814bf-kube-api-access-k2kjg\") pod \"horizon-operator-controller-manager-5b9b8895d5-vkgzq\" (UID: \"df1f8ad9-7e58-4e54-b189-d994806814bf\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.522836 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxnxq\" (UniqueName: \"kubernetes.io/projected/5d4d2d05-77bc-4e37-8450-b2281aaa468a-kube-api-access-xxnxq\") pod \"keystone-operator-controller-manager-b4d948c87-6wl45\" (UID: \"5d4d2d05-77bc-4e37-8450-b2281aaa468a\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.538332 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.538820 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.540512 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.560024 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.560566 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-ch6sf" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.590704 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kbgq\" (UniqueName: \"kubernetes.io/projected/269f6148-d710-4a17-9180-0258afb04709-kube-api-access-5kbgq\") pod \"neutron-operator-controller-manager-64ddbf8bb-ghxp2\" (UID: \"269f6148-d710-4a17-9180-0258afb04709\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.590762 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx8gr\" (UniqueName: \"kubernetes.io/projected/0cca4cb1-dbd6-45f5-b888-34735df9ca51-kube-api-access-mx8gr\") pod \"octavia-operator-controller-manager-69f8888797-bgw24\" (UID: \"0cca4cb1-dbd6-45f5-b888-34735df9ca51\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.590826 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glxl7\" (UniqueName: \"kubernetes.io/projected/784b9cd8-45d9-4167-b857-edc06f4ce473-kube-api-access-glxl7\") pod \"nova-operator-controller-manager-567668f5cf-9bx28\" (UID: \"784b9cd8-45d9-4167-b857-edc06f4ce473\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" Feb 18 12:07:00 crc 
kubenswrapper[4880]: I0218 12:07:00.590892 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpg5x\" (UniqueName: \"kubernetes.io/projected/58de4ed7-c54f-4970-99ab-dae7521db195-kube-api-access-tpg5x\") pod \"mariadb-operator-controller-manager-6994f66f48-2r47l\" (UID: \"58de4ed7-c54f-4970-99ab-dae7521db195\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.606796 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.614135 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.623073 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpg5x\" (UniqueName: \"kubernetes.io/projected/58de4ed7-c54f-4970-99ab-dae7521db195-kube-api-access-tpg5x\") pod \"mariadb-operator-controller-manager-6994f66f48-2r47l\" (UID: \"58de4ed7-c54f-4970-99ab-dae7521db195\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.638398 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.646460 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.650223 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hfzd9" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.672183 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.672249 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.694684 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4ksb\" (UniqueName: \"kubernetes.io/projected/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-kube-api-access-n4ksb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.694758 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glxl7\" (UniqueName: \"kubernetes.io/projected/784b9cd8-45d9-4167-b857-edc06f4ce473-kube-api-access-glxl7\") pod \"nova-operator-controller-manager-567668f5cf-9bx28\" (UID: \"784b9cd8-45d9-4167-b857-edc06f4ce473\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.694908 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kbgq\" (UniqueName: \"kubernetes.io/projected/269f6148-d710-4a17-9180-0258afb04709-kube-api-access-5kbgq\") pod 
\"neutron-operator-controller-manager-64ddbf8bb-ghxp2\" (UID: \"269f6148-d710-4a17-9180-0258afb04709\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.694938 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx8gr\" (UniqueName: \"kubernetes.io/projected/0cca4cb1-dbd6-45f5-b888-34735df9ca51-kube-api-access-mx8gr\") pod \"octavia-operator-controller-manager-69f8888797-bgw24\" (UID: \"0cca4cb1-dbd6-45f5-b888-34735df9ca51\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.694988 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.696811 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.698795 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.704352 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-q7vxz" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.717698 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.721590 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx8gr\" (UniqueName: \"kubernetes.io/projected/0cca4cb1-dbd6-45f5-b888-34735df9ca51-kube-api-access-mx8gr\") pod \"octavia-operator-controller-manager-69f8888797-bgw24\" (UID: \"0cca4cb1-dbd6-45f5-b888-34735df9ca51\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.721870 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glxl7\" (UniqueName: \"kubernetes.io/projected/784b9cd8-45d9-4167-b857-edc06f4ce473-kube-api-access-glxl7\") pod \"nova-operator-controller-manager-567668f5cf-9bx28\" (UID: \"784b9cd8-45d9-4167-b857-edc06f4ce473\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.732364 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.750189 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.764780 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kbgq\" (UniqueName: \"kubernetes.io/projected/269f6148-d710-4a17-9180-0258afb04709-kube-api-access-5kbgq\") pod \"neutron-operator-controller-manager-64ddbf8bb-ghxp2\" (UID: \"269f6148-d710-4a17-9180-0258afb04709\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.790436 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.790987 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.796773 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.797832 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.798835 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.798899 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4ksb\" (UniqueName: \"kubernetes.io/projected/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-kube-api-access-n4ksb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.798955 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbw59\" (UniqueName: \"kubernetes.io/projected/87dceaba-8b74-4e4a-847e-cfcb186347d5-kube-api-access-bbw59\") pod \"ovn-operator-controller-manager-d44cf6b75-bb2nv\" (UID: \"87dceaba-8b74-4e4a-847e-cfcb186347d5\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.798982 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckr86\" (UniqueName: \"kubernetes.io/projected/9f177e4f-9739-4a2b-924c-c07a176b7e06-kube-api-access-ckr86\") pod \"placement-operator-controller-manager-8497b45c89-7vr5w\" (UID: \"9f177e4f-9739-4a2b-924c-c07a176b7e06\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" Feb 18 12:07:00 
crc kubenswrapper[4880]: E0218 12:07:00.799627 4880 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 12:07:00 crc kubenswrapper[4880]: E0218 12:07:00.799690 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert podName:b9151c49-6a98-487d-a1a5-4c01a1d43bbf nodeName:}" failed. No retries permitted until 2026-02-18 12:07:01.29966437 +0000 UTC m=+928.728565231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" (UID: "b9151c49-6a98-487d-a1a5-4c01a1d43bbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.801912 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-sv2x8" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.832516 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.833847 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4ksb\" (UniqueName: \"kubernetes.io/projected/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-kube-api-access-n4ksb\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.833907 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.836643 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xlnkc" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.852049 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.865995 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.893983 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.900254 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbw59\" (UniqueName: \"kubernetes.io/projected/87dceaba-8b74-4e4a-847e-cfcb186347d5-kube-api-access-bbw59\") pod \"ovn-operator-controller-manager-d44cf6b75-bb2nv\" (UID: \"87dceaba-8b74-4e4a-847e-cfcb186347d5\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.900303 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckr86\" (UniqueName: \"kubernetes.io/projected/9f177e4f-9739-4a2b-924c-c07a176b7e06-kube-api-access-ckr86\") pod \"placement-operator-controller-manager-8497b45c89-7vr5w\" (UID: \"9f177e4f-9739-4a2b-924c-c07a176b7e06\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.900407 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-pzsqv\" (UniqueName: \"kubernetes.io/projected/996aea3a-0693-43ee-b40a-f623c674cc12-kube-api-access-pzsqv\") pod \"swift-operator-controller-manager-68f46476f-ldqtr\" (UID: \"996aea3a-0693-43ee-b40a-f623c674cc12\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.901247 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.913655 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.917297 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-59vtb"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.918296 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-59vtb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.921087 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-r7pgb" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.927835 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckr86\" (UniqueName: \"kubernetes.io/projected/9f177e4f-9739-4a2b-924c-c07a176b7e06-kube-api-access-ckr86\") pod \"placement-operator-controller-manager-8497b45c89-7vr5w\" (UID: \"9f177e4f-9739-4a2b-924c-c07a176b7e06\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.928757 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-59vtb"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.936819 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbw59\" (UniqueName: \"kubernetes.io/projected/87dceaba-8b74-4e4a-847e-cfcb186347d5-kube-api-access-bbw59\") pod \"ovn-operator-controller-manager-d44cf6b75-bb2nv\" (UID: \"87dceaba-8b74-4e4a-847e-cfcb186347d5\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.959190 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.961851 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.964042 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.966508 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4bhgr" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.988570 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx"] Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.989791 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.993967 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.994249 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 18 12:07:00 crc kubenswrapper[4880]: I0218 12:07:00.994549 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-p7v5j" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.010475 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.012815 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzsqv\" (UniqueName: \"kubernetes.io/projected/996aea3a-0693-43ee-b40a-f623c674cc12-kube-api-access-pzsqv\") pod 
\"swift-operator-controller-manager-68f46476f-ldqtr\" (UID: \"996aea3a-0693-43ee-b40a-f623c674cc12\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.013126 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c59h5\" (UniqueName: \"kubernetes.io/projected/be820469-d6fd-48f8-bf42-e03cd3f95634-kube-api-access-c59h5\") pod \"telemetry-operator-controller-manager-7f45b4ff68-q6zml\" (UID: \"be820469-d6fd-48f8-bf42-e03cd3f95634\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.013230 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.013504 4880 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.013599 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert podName:eaad9a94-37d2-4df9-ad82-1728dde9a0c4 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:02.013579597 +0000 UTC m=+929.442480458 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert") pod "infra-operator-controller-manager-79d975b745-k2psm" (UID: "eaad9a94-37d2-4df9-ad82-1728dde9a0c4") : secret "infra-operator-webhook-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.014598 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.030554 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.031664 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.037437 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.037963 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzsqv\" (UniqueName: \"kubernetes.io/projected/996aea3a-0693-43ee-b40a-f623c674cc12-kube-api-access-pzsqv\") pod \"swift-operator-controller-manager-68f46476f-ldqtr\" (UID: \"996aea3a-0693-43ee-b40a-f623c674cc12\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.038773 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-j8v5j" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.082417 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.092191 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.103851 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.115302 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xflq9\" (UniqueName: \"kubernetes.io/projected/247f4692-a10b-46e0-8304-9effd006efa4-kube-api-access-xflq9\") pod \"test-operator-controller-manager-7866795846-59vtb\" (UID: \"247f4692-a10b-46e0-8304-9effd006efa4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-59vtb" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.115405 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c59h5\" (UniqueName: \"kubernetes.io/projected/be820469-d6fd-48f8-bf42-e03cd3f95634-kube-api-access-c59h5\") pod \"telemetry-operator-controller-manager-7f45b4ff68-q6zml\" (UID: \"be820469-d6fd-48f8-bf42-e03cd3f95634\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.115432 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92wfb\" (UniqueName: \"kubernetes.io/projected/dae0ccf1-c72f-4a57-960a-af1ed14922d7-kube-api-access-92wfb\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.115463 
4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784d7\" (UniqueName: \"kubernetes.io/projected/d09ec4e5-9afc-49c9-b2f0-92daacefb795-kube-api-access-784d7\") pod \"watcher-operator-controller-manager-6788bf7557-9kzzr\" (UID: \"d09ec4e5-9afc-49c9-b2f0-92daacefb795\") " pod="openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.115543 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.115570 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.126827 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.141309 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.149935 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c59h5\" (UniqueName: \"kubernetes.io/projected/be820469-d6fd-48f8-bf42-e03cd3f95634-kube-api-access-c59h5\") pod \"telemetry-operator-controller-manager-7f45b4ff68-q6zml\" (UID: \"be820469-d6fd-48f8-bf42-e03cd3f95634\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.163082 4880 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.166193 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.216521 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92wfb\" (UniqueName: \"kubernetes.io/projected/dae0ccf1-c72f-4a57-960a-af1ed14922d7-kube-api-access-92wfb\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.217880 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-784d7\" (UniqueName: \"kubernetes.io/projected/d09ec4e5-9afc-49c9-b2f0-92daacefb795-kube-api-access-784d7\") pod \"watcher-operator-controller-manager-6788bf7557-9kzzr\" (UID: \"d09ec4e5-9afc-49c9-b2f0-92daacefb795\") " pod="openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.217935 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wp6z\" (UniqueName: \"kubernetes.io/projected/ec4d0dea-02ec-481b-969c-0f8a0567c836-kube-api-access-9wp6z\") pod \"rabbitmq-cluster-operator-manager-668c99d594-q7fsv\" (UID: \"ec4d0dea-02ec-481b-969c-0f8a0567c836\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.218016 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 
12:07:01.218053 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.218084 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xflq9\" (UniqueName: \"kubernetes.io/projected/247f4692-a10b-46e0-8304-9effd006efa4-kube-api-access-xflq9\") pod \"test-operator-controller-manager-7866795846-59vtb\" (UID: \"247f4692-a10b-46e0-8304-9effd006efa4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-59vtb" Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.219802 4880 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.219875 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:01.719851219 +0000 UTC m=+929.148752090 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "metrics-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.219894 4880 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.219936 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:01.719923041 +0000 UTC m=+929.148823902 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "webhook-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.248213 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xflq9\" (UniqueName: \"kubernetes.io/projected/247f4692-a10b-46e0-8304-9effd006efa4-kube-api-access-xflq9\") pod \"test-operator-controller-manager-7866795846-59vtb\" (UID: \"247f4692-a10b-46e0-8304-9effd006efa4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-59vtb" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.251949 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-784d7\" (UniqueName: \"kubernetes.io/projected/d09ec4e5-9afc-49c9-b2f0-92daacefb795-kube-api-access-784d7\") pod \"watcher-operator-controller-manager-6788bf7557-9kzzr\" (UID: \"d09ec4e5-9afc-49c9-b2f0-92daacefb795\") " 
pod="openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.276955 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92wfb\" (UniqueName: \"kubernetes.io/projected/dae0ccf1-c72f-4a57-960a-af1ed14922d7-kube-api-access-92wfb\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:01 crc kubenswrapper[4880]: W0218 12:07:01.298684 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd00b294_1b81_4988_bf25_d5681534e3f1.slice/crio-b96a4bb2a057f016741395acfe7b84d9dd3ec3be9e14d8b5fa3358d971ac0ab6 WatchSource:0}: Error finding container b96a4bb2a057f016741395acfe7b84d9dd3ec3be9e14d8b5fa3358d971ac0ab6: Status 404 returned error can't find the container with id b96a4bb2a057f016741395acfe7b84d9dd3ec3be9e14d8b5fa3358d971ac0ab6 Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.299892 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg" event={"ID":"1d6b527e-fad8-4853-a0ab-d9fa69c934ff","Type":"ContainerStarted","Data":"31557c09ac2368f6c795419830bdea3c68579e6ff7db370d951c7b80a5bab305"} Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.302097 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.321045 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.329297 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wp6z\" (UniqueName: \"kubernetes.io/projected/ec4d0dea-02ec-481b-969c-0f8a0567c836-kube-api-access-9wp6z\") pod \"rabbitmq-cluster-operator-manager-668c99d594-q7fsv\" (UID: \"ec4d0dea-02ec-481b-969c-0f8a0567c836\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.329595 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.329949 4880 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.330091 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert podName:b9151c49-6a98-487d-a1a5-4c01a1d43bbf nodeName:}" failed. No retries permitted until 2026-02-18 12:07:02.330068061 +0000 UTC m=+929.758968932 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" (UID: "b9151c49-6a98-487d-a1a5-4c01a1d43bbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: W0218 12:07:01.338469 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod479cc862_303e_4c55_9818_4c4cd04a5987.slice/crio-b966f6982cb88eec1eb08cbb7157f31b9170c62621905a0c3d08efdfdd5b64cd WatchSource:0}: Error finding container b966f6982cb88eec1eb08cbb7157f31b9170c62621905a0c3d08efdfdd5b64cd: Status 404 returned error can't find the container with id b966f6982cb88eec1eb08cbb7157f31b9170c62621905a0c3d08efdfdd5b64cd Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.352928 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wp6z\" (UniqueName: \"kubernetes.io/projected/ec4d0dea-02ec-481b-969c-0f8a0567c836-kube-api-access-9wp6z\") pod \"rabbitmq-cluster-operator-manager-668c99d594-q7fsv\" (UID: \"ec4d0dea-02ec-481b-969c-0f8a0567c836\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.408010 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.546262 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-59vtb" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.739337 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.739387 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.739691 4880 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.739748 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:02.739729396 +0000 UTC m=+930.168630257 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "webhook-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.739794 4880 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.739813 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:02.739805418 +0000 UTC m=+930.168706279 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "metrics-server-cert" not found Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.819596 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.828523 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.848973 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9"] Feb 18 12:07:01 crc kubenswrapper[4880]: W0218 12:07:01.860730 4880 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7975dd71_7d05_400c_9058_c76c811d0bb0.slice/crio-2353cadcb48e61eb3a8680a9a355e691139a8f473aaea5bf115a6320cdcfe141 WatchSource:0}: Error finding container 2353cadcb48e61eb3a8680a9a355e691139a8f473aaea5bf115a6320cdcfe141: Status 404 returned error can't find the container with id 2353cadcb48e61eb3a8680a9a355e691139a8f473aaea5bf115a6320cdcfe141 Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.869333 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.876738 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.884284 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.888270 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.919379 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.922088 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28"] Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.930161 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr"] Feb 18 12:07:01 crc kubenswrapper[4880]: W0218 12:07:01.939602 4880 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87dceaba_8b74_4e4a_847e_cfcb186347d5.slice/crio-b6661be7a4d9fbcb5e5d42d2425c3d47d4eed7a60983a6391cb146fd05e9b38e WatchSource:0}: Error finding container b6661be7a4d9fbcb5e5d42d2425c3d47d4eed7a60983a6391cb146fd05e9b38e: Status 404 returned error can't find the container with id b6661be7a4d9fbcb5e5d42d2425c3d47d4eed7a60983a6391cb146fd05e9b38e Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.940597 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24"] Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.941062 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bbw59,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-bb2nv_openstack-operators(87dceaba-8b74-4e4a-847e-cfcb186347d5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.942258 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" podUID="87dceaba-8b74-4e4a-847e-cfcb186347d5" Feb 18 12:07:01 crc kubenswrapper[4880]: W0218 12:07:01.944242 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cca4cb1_dbd6_45f5_b888_34735df9ca51.slice/crio-dadcce9ce3eb573ef17ab67d7e7f298bf8def52b6a4fa203dd604649c2e4cbd0 WatchSource:0}: Error finding container 
dadcce9ce3eb573ef17ab67d7e7f298bf8def52b6a4fa203dd604649c2e4cbd0: Status 404 returned error can't find the container with id dadcce9ce3eb573ef17ab67d7e7f298bf8def52b6a4fa203dd604649c2e4cbd0 Feb 18 12:07:01 crc kubenswrapper[4880]: I0218 12:07:01.945770 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv"] Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.947717 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mx8gr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-bgw24_openstack-operators(0cca4cb1-dbd6-45f5-b888-34735df9ca51): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 12:07:01 crc kubenswrapper[4880]: E0218 12:07:01.948835 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" podUID="0cca4cb1-dbd6-45f5-b888-34735df9ca51" Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.044789 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.044964 4880 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 
12:07:02.045039 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert podName:eaad9a94-37d2-4df9-ad82-1728dde9a0c4 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:04.04500668 +0000 UTC m=+931.473907541 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert") pod "infra-operator-controller-manager-79d975b745-k2psm" (UID: "eaad9a94-37d2-4df9-ad82-1728dde9a0c4") : secret "infra-operator-webhook-server-cert" not found Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.145802 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr"] Feb 18 12:07:02 crc kubenswrapper[4880]: W0218 12:07:02.149822 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd09ec4e5_9afc_49c9_b2f0_92daacefb795.slice/crio-a5d313b479bb8d80853048b9c3c23c5452b1281885714916dd7c2aa7ea36bbdb WatchSource:0}: Error finding container a5d313b479bb8d80853048b9c3c23c5452b1281885714916dd7c2aa7ea36bbdb: Status 404 returned error can't find the container with id a5d313b479bb8d80853048b9c3c23c5452b1281885714916dd7c2aa7ea36bbdb Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.156203 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ckr86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-7vr5w_openstack-operators(9f177e4f-9739-4a2b-924c-c07a176b7e06): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.157633 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" podUID="9f177e4f-9739-4a2b-924c-c07a176b7e06" Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.157698 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv"] Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.157979 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9wp6z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-q7fsv_openstack-operators(ec4d0dea-02ec-481b-969c-0f8a0567c836): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.159159 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" podUID="ec4d0dea-02ec-481b-969c-0f8a0567c836" Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.168296 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w"] Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.175053 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml"] Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.187754 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c59h5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-q6zml_openstack-operators(be820469-d6fd-48f8-bf42-e03cd3f95634): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.189021 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" podUID="be820469-d6fd-48f8-bf42-e03cd3f95634" Feb 18 12:07:02 crc 
kubenswrapper[4880]: I0218 12:07:02.302383 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-59vtb"] Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.341558 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb" event={"ID":"bf26df20-6bff-4038-bfcb-c87b39c061ed","Type":"ContainerStarted","Data":"262ca625987f65c5e1ad8e374665254762b9a5892c94d2c0a64debdbd63a34f6"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.354581 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.354813 4880 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.354928 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert podName:b9151c49-6a98-487d-a1a5-4c01a1d43bbf nodeName:}" failed. No retries permitted until 2026-02-18 12:07:04.354903035 +0000 UTC m=+931.783803896 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" (UID: "b9151c49-6a98-487d-a1a5-4c01a1d43bbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.360305 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" event={"ID":"0cca4cb1-dbd6-45f5-b888-34735df9ca51","Type":"ContainerStarted","Data":"dadcce9ce3eb573ef17ab67d7e7f298bf8def52b6a4fa203dd604649c2e4cbd0"} Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.363691 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" podUID="0cca4cb1-dbd6-45f5-b888-34735df9ca51" Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.367514 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l" event={"ID":"58de4ed7-c54f-4970-99ab-dae7521db195","Type":"ContainerStarted","Data":"2ee97075fb02ee3ea0623e14d6ac728975757430a90b99c126bb06570f653ba1"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.369731 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb" event={"ID":"fd00b294-1b81-4988-bf25-d5681534e3f1","Type":"ContainerStarted","Data":"b96a4bb2a057f016741395acfe7b84d9dd3ec3be9e14d8b5fa3358d971ac0ab6"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.373165 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq" event={"ID":"df1f8ad9-7e58-4e54-b189-d994806814bf","Type":"ContainerStarted","Data":"5954ec96faf30c9012f5b767c807c36909996b8e6a6badd8b4d46e3f3ff7dce5"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.379895 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt" event={"ID":"479cc862-303e-4c55-9818-4c4cd04a5987","Type":"ContainerStarted","Data":"b966f6982cb88eec1eb08cbb7157f31b9170c62621905a0c3d08efdfdd5b64cd"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.383010 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" event={"ID":"be820469-d6fd-48f8-bf42-e03cd3f95634","Type":"ContainerStarted","Data":"636a5bd8fe0406f736905b519dc4b1f1e8beb33a7dd38394b49aa56c8b0e5d82"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.385342 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" event={"ID":"784b9cd8-45d9-4167-b857-edc06f4ce473","Type":"ContainerStarted","Data":"637406c98b34e4aef2e94b54b42d1e5d23fab227e4795fe0108f3d994438e9dc"} Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.388106 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" podUID="be820469-d6fd-48f8-bf42-e03cd3f95634" Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.388892 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2" 
event={"ID":"269f6148-d710-4a17-9180-0258afb04709","Type":"ContainerStarted","Data":"9765253015a91c2fb5a2748ce72c30619e9aa0b3d0c9d76b26286d4faf59bddf"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.392394 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45" event={"ID":"5d4d2d05-77bc-4e37-8450-b2281aaa468a","Type":"ContainerStarted","Data":"3d994d35a16607f364e827af41e8c00c8ca2ded1a89ff152707f1e9cecbabea0"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.395029 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" event={"ID":"cb351566-0fe8-4583-86c9-cd123efb0db6","Type":"ContainerStarted","Data":"f03627b94c6e949d62cf75c9421ff7e79c6dea907743a577b53fcd2e83ab2637"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.397314 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" event={"ID":"ec4d0dea-02ec-481b-969c-0f8a0567c836","Type":"ContainerStarted","Data":"62a4cbf55c4b6091cf88bf95577fe46b2a3cb265f1582b971debf3f4fa7f8943"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.399800 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" event={"ID":"9f177e4f-9739-4a2b-924c-c07a176b7e06","Type":"ContainerStarted","Data":"1939486217ebcecccb70b78f714e7167b0ea7cb9beb5c63e3009160401787564"} Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.406023 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" 
podUID="9f177e4f-9739-4a2b-924c-c07a176b7e06" Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.406131 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" podUID="ec4d0dea-02ec-481b-969c-0f8a0567c836" Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.410553 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr" event={"ID":"996aea3a-0693-43ee-b40a-f623c674cc12","Type":"ContainerStarted","Data":"49b838ef0c1ab1fe11973bf7046457dec476318c4e742c8a9af4b65a17ead03e"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.421164 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" event={"ID":"87dceaba-8b74-4e4a-847e-cfcb186347d5","Type":"ContainerStarted","Data":"b6661be7a4d9fbcb5e5d42d2425c3d47d4eed7a60983a6391cb146fd05e9b38e"} Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.443075 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" podUID="87dceaba-8b74-4e4a-847e-cfcb186347d5" Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.445488 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" 
event={"ID":"7975dd71-7d05-400c-9058-c76c811d0bb0","Type":"ContainerStarted","Data":"2353cadcb48e61eb3a8680a9a355e691139a8f473aaea5bf115a6320cdcfe141"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.446679 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h" event={"ID":"bfb70827-0634-4721-8cac-3d1a0e4153a6","Type":"ContainerStarted","Data":"5da0a722e85648129395c45533b4b1c391d702b3119bd3570cc3ae96bd201654"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.455255 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr" event={"ID":"d09ec4e5-9afc-49c9-b2f0-92daacefb795","Type":"ContainerStarted","Data":"a5d313b479bb8d80853048b9c3c23c5452b1281885714916dd7c2aa7ea36bbdb"} Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.763520 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:02 crc kubenswrapper[4880]: I0218 12:07:02.763585 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.763798 4880 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.763915 4880 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:04.763887494 +0000 UTC m=+932.192788435 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "metrics-server-cert" not found Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.763965 4880 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 12:07:02 crc kubenswrapper[4880]: E0218 12:07:02.764094 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:04.764060508 +0000 UTC m=+932.192961519 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "webhook-server-cert" not found Feb 18 12:07:03 crc kubenswrapper[4880]: I0218 12:07:03.490584 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-59vtb" event={"ID":"247f4692-a10b-46e0-8304-9effd006efa4","Type":"ContainerStarted","Data":"73024fe66450f727d957af9e740f01394a07e1b81119393a32e2b88dbe56c52f"} Feb 18 12:07:03 crc kubenswrapper[4880]: E0218 12:07:03.492571 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" podUID="be820469-d6fd-48f8-bf42-e03cd3f95634" Feb 18 12:07:03 crc kubenswrapper[4880]: E0218 12:07:03.492982 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" podUID="87dceaba-8b74-4e4a-847e-cfcb186347d5" Feb 18 12:07:03 crc kubenswrapper[4880]: E0218 12:07:03.493004 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" podUID="ec4d0dea-02ec-481b-969c-0f8a0567c836" Feb 18 12:07:03 crc kubenswrapper[4880]: E0218 12:07:03.493808 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" podUID="0cca4cb1-dbd6-45f5-b888-34735df9ca51" Feb 18 12:07:03 crc kubenswrapper[4880]: E0218 12:07:03.494080 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" podUID="9f177e4f-9739-4a2b-924c-c07a176b7e06" Feb 18 12:07:04 crc kubenswrapper[4880]: I0218 12:07:04.094197 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:04 crc kubenswrapper[4880]: E0218 12:07:04.094584 4880 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 12:07:04 crc kubenswrapper[4880]: E0218 12:07:04.094671 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert podName:eaad9a94-37d2-4df9-ad82-1728dde9a0c4 nodeName:}" failed. 
No retries permitted until 2026-02-18 12:07:08.094650789 +0000 UTC m=+935.523551650 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert") pod "infra-operator-controller-manager-79d975b745-k2psm" (UID: "eaad9a94-37d2-4df9-ad82-1728dde9a0c4") : secret "infra-operator-webhook-server-cert" not found Feb 18 12:07:04 crc kubenswrapper[4880]: I0218 12:07:04.402805 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:04 crc kubenswrapper[4880]: E0218 12:07:04.403029 4880 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 12:07:04 crc kubenswrapper[4880]: E0218 12:07:04.403112 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert podName:b9151c49-6a98-487d-a1a5-4c01a1d43bbf nodeName:}" failed. No retries permitted until 2026-02-18 12:07:08.403091009 +0000 UTC m=+935.831991870 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" (UID: "b9151c49-6a98-487d-a1a5-4c01a1d43bbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 12:07:04 crc kubenswrapper[4880]: I0218 12:07:04.815328 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx"
Feb 18 12:07:04 crc kubenswrapper[4880]: I0218 12:07:04.815391 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx"
Feb 18 12:07:04 crc kubenswrapper[4880]: E0218 12:07:04.815525 4880 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 18 12:07:04 crc kubenswrapper[4880]: E0218 12:07:04.815625 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:08.815588892 +0000 UTC m=+936.244489753 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "metrics-server-cert" not found
Feb 18 12:07:04 crc kubenswrapper[4880]: E0218 12:07:04.815629 4880 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 18 12:07:04 crc kubenswrapper[4880]: E0218 12:07:04.815688 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:08.815669574 +0000 UTC m=+936.244570425 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "webhook-server-cert" not found
Feb 18 12:07:08 crc kubenswrapper[4880]: I0218 12:07:08.172191 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm"
Feb 18 12:07:08 crc kubenswrapper[4880]: E0218 12:07:08.172422 4880 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 18 12:07:08 crc kubenswrapper[4880]: E0218 12:07:08.173111 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert
podName:eaad9a94-37d2-4df9-ad82-1728dde9a0c4 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:16.173087765 +0000 UTC m=+943.601988626 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert") pod "infra-operator-controller-manager-79d975b745-k2psm" (UID: "eaad9a94-37d2-4df9-ad82-1728dde9a0c4") : secret "infra-operator-webhook-server-cert" not found
Feb 18 12:07:08 crc kubenswrapper[4880]: I0218 12:07:08.476754 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw"
Feb 18 12:07:08 crc kubenswrapper[4880]: E0218 12:07:08.476992 4880 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 12:07:08 crc kubenswrapper[4880]: E0218 12:07:08.477105 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert podName:b9151c49-6a98-487d-a1a5-4c01a1d43bbf nodeName:}" failed. No retries permitted until 2026-02-18 12:07:16.477077589 +0000 UTC m=+943.905978520 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" (UID: "b9151c49-6a98-487d-a1a5-4c01a1d43bbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 12:07:08 crc kubenswrapper[4880]: I0218 12:07:08.883431 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx"
Feb 18 12:07:08 crc kubenswrapper[4880]: I0218 12:07:08.883518 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx"
Feb 18 12:07:08 crc kubenswrapper[4880]: E0218 12:07:08.883718 4880 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 18 12:07:08 crc kubenswrapper[4880]: E0218 12:07:08.883821 4880 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 18 12:07:08 crc kubenswrapper[4880]: E0218 12:07:08.883831 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:16.883802743 +0000 UTC m=+944.312703674 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "metrics-server-cert" not found
Feb 18 12:07:08 crc kubenswrapper[4880]: E0218 12:07:08.883954 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:16.883927626 +0000 UTC m=+944.312828557 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "webhook-server-cert" not found
Feb 18 12:07:14 crc kubenswrapper[4880]: E0218 12:07:14.747261 4880 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc"
Feb 18 12:07:14 crc kubenswrapper[4880]: E0218 12:07:14.748196 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tpzt9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-868647ff47-t4mk9_openstack-operators(cb351566-0fe8-4583-86c9-cd123efb0db6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 12:07:14 crc kubenswrapper[4880]: E0218 12:07:14.749551 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" podUID="cb351566-0fe8-4583-86c9-cd123efb0db6"
Feb 18 12:07:15 crc kubenswrapper[4880]: E0218 12:07:15.281662 4880 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df"
Feb 18 12:07:15 crc kubenswrapper[4880]: E0218 12:07:15.281960 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="container
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7bbrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987464f4-mcs6b_openstack-operators(7975dd71-7d05-400c-9058-c76c811d0bb0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 12:07:15 crc kubenswrapper[4880]: E0218 12:07:15.283077 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" podUID="7975dd71-7d05-400c-9058-c76c811d0bb0"
Feb 18 12:07:15 crc kubenswrapper[4880]: E0218 12:07:15.595756 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" podUID="7975dd71-7d05-400c-9058-c76c811d0bb0"
Feb 18 12:07:15 crc kubenswrapper[4880]: E0218 12:07:15.595824 4880 pod_workers.go:1301] "Error syncing pod, skipping"
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" podUID="cb351566-0fe8-4583-86c9-cd123efb0db6"
Feb 18 12:07:16 crc kubenswrapper[4880]: I0218 12:07:16.208586 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm"
Feb 18 12:07:16 crc kubenswrapper[4880]: E0218 12:07:16.208780 4880 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 18 12:07:16 crc kubenswrapper[4880]: E0218 12:07:16.208834 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert podName:eaad9a94-37d2-4df9-ad82-1728dde9a0c4 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:32.208815503 +0000 UTC m=+959.637716364 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert") pod "infra-operator-controller-manager-79d975b745-k2psm" (UID: "eaad9a94-37d2-4df9-ad82-1728dde9a0c4") : secret "infra-operator-webhook-server-cert" not found
Feb 18 12:07:16 crc kubenswrapper[4880]: I0218 12:07:16.513947 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw"
Feb 18 12:07:16 crc kubenswrapper[4880]: E0218 12:07:16.514179 4880 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 12:07:16 crc kubenswrapper[4880]: E0218 12:07:16.514274 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert podName:b9151c49-6a98-487d-a1a5-4c01a1d43bbf nodeName:}" failed. No retries permitted until 2026-02-18 12:07:32.514249831 +0000 UTC m=+959.943150692 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" (UID: "b9151c49-6a98-487d-a1a5-4c01a1d43bbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 12:07:16 crc kubenswrapper[4880]: I0218 12:07:16.918512 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx"
Feb 18 12:07:16 crc kubenswrapper[4880]: I0218 12:07:16.918589 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx"
Feb 18 12:07:16 crc kubenswrapper[4880]: E0218 12:07:16.918833 4880 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 18 12:07:16 crc kubenswrapper[4880]: E0218 12:07:16.918898 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:32.918876545 +0000 UTC m=+960.347777406 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "webhook-server-cert" not found
Feb 18 12:07:16 crc kubenswrapper[4880]: E0218 12:07:16.918913 4880 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 18 12:07:16 crc kubenswrapper[4880]: E0218 12:07:16.919048 4880 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs podName:dae0ccf1-c72f-4a57-960a-af1ed14922d7 nodeName:}" failed. No retries permitted until 2026-02-18 12:07:32.919015368 +0000 UTC m=+960.347916269 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs") pod "openstack-operator-controller-manager-78cf7c9dc-t68gx" (UID: "dae0ccf1-c72f-4a57-960a-af1ed14922d7") : secret "metrics-server-cert" not found
Feb 18 12:07:17 crc kubenswrapper[4880]: E0218 12:07:17.231388 4880 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838"
Feb 18 12:07:17 crc kubenswrapper[4880]: E0218 12:07:17.231817 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-glxl7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-9bx28_openstack-operators(784b9cd8-45d9-4167-b857-edc06f4ce473): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 12:07:17 crc kubenswrapper[4880]: E0218 12:07:17.233047 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" podUID="784b9cd8-45d9-4167-b857-edc06f4ce473"
Feb 18 12:07:17 crc kubenswrapper[4880]: E0218 12:07:17.609094 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" podUID="784b9cd8-45d9-4167-b857-edc06f4ce473"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.625982 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt" event={"ID":"479cc862-303e-4c55-9818-4c4cd04a5987","Type":"ContainerStarted","Data":"f394436bac28573e432fd8932fae07d2c1107ac5b71b12b5e4b9de643532834a"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.627490 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.640484 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr" event={"ID":"996aea3a-0693-43ee-b40a-f623c674cc12","Type":"ContainerStarted","Data":"4a3d5b95b2d33d0fd5de456ae571a08763a388c986d165e79530e6a0e5a54cbf"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.640551 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.653204 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt" podStartSLOduration=2.700727356 podStartE2EDuration="18.653188528s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.362127312 +0000 UTC m=+928.791028173" lastFinishedPulling="2026-02-18 12:07:17.314588484 +0000 UTC m=+944.743489345" observedRunningTime="2026-02-18 12:07:18.652729287 +0000 UTC m=+946.081630148" watchObservedRunningTime="2026-02-18 12:07:18.653188528 +0000 UTC m=+946.082089389"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.654580 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2" event={"ID":"269f6148-d710-4a17-9180-0258afb04709","Type":"ContainerStarted","Data":"b38b90bc3d71c76a6477c072fcee16b545cdfcb5ce89a4797839e3f63a37019f"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.655118 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.660006 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l" event={"ID":"58de4ed7-c54f-4970-99ab-dae7521db195","Type":"ContainerStarted","Data":"4e33cf565cb9ade13be3fd6905d330fa52510075676f91d447d1b0b5787b6983"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.660257 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.662025 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45" event={"ID":"5d4d2d05-77bc-4e37-8450-b2281aaa468a","Type":"ContainerStarted","Data":"838232187deb43a36fd0a08260e4791596bb1d16911e9cdc22111fd15673861e"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.662757 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.668335 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb" event={"ID":"bf26df20-6bff-4038-bfcb-c87b39c061ed","Type":"ContainerStarted","Data":"9ef74df400a34cb9884aaadc568ea6076c1211ae23abae4e742e05f012873ad2"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.668528 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.671932 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg" event={"ID":"1d6b527e-fad8-4853-a0ab-d9fa69c934ff","Type":"ContainerStarted","Data":"cca9ba31c0df23befc1001c33359977dd531198310e5e14c222ca445cd1fb839"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.672117 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.678629 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr" podStartSLOduration=3.260229236 podStartE2EDuration="18.677487733s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.939337338 +0000 UTC m=+929.368238199" lastFinishedPulling="2026-02-18 12:07:17.356595845 +0000 UTC m=+944.785496696" observedRunningTime="2026-02-18 12:07:18.671447567 +0000 UTC m=+946.100348438" watchObservedRunningTime="2026-02-18 12:07:18.677487733 +0000 UTC m=+946.106388594"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.679939 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h" event={"ID":"bfb70827-0634-4721-8cac-3d1a0e4153a6","Type":"ContainerStarted","Data":"35cb430af45189cab4d15483789b60221a5b0e0c930663dbe2741bfa9d998b80"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.680043 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.688745 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-59vtb" event={"ID":"247f4692-a10b-46e0-8304-9effd006efa4","Type":"ContainerStarted","Data":"e94b84ab6da7e0fe2e4ec00bbb48abcb3ee76d489647d194f26eda3538654892"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.689178 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-59vtb"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.691966 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr" event={"ID":"d09ec4e5-9afc-49c9-b2f0-92daacefb795","Type":"ContainerStarted","Data":"7b8e32b5aa5b97e89fe9e43dd665580de5601fb23935740b068a6e0535af1a97"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.692661 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.703520 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb" event={"ID":"fd00b294-1b81-4988-bf25-d5681534e3f1","Type":"ContainerStarted","Data":"2d116aedc5a3cb510ad27373c21191dd6b5a3a6dd4818155f22be16f6bdf7fd4"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.703846 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.709203 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2" podStartSLOduration=3.252399757 podStartE2EDuration="18.709176865s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.899975511 +0000 UTC m=+929.328876372" lastFinishedPulling="2026-02-18 12:07:17.356752619 +0000 UTC m=+944.785653480" observedRunningTime="2026-02-18 12:07:18.698319594 +0000 UTC m=+946.127220455" watchObservedRunningTime="2026-02-18 12:07:18.709176865 +0000 UTC m=+946.138077726"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.724885 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq" event={"ID":"df1f8ad9-7e58-4e54-b189-d994806814bf","Type":"ContainerStarted","Data":"2382f20694d3f035fe56ff145e9a6a59beafe75942fb1f2d3790f3524e645f9f"}
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.725677 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.741430 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45" podStartSLOduration=3.278289329 podStartE2EDuration="18.7414098s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.850600323 +0000 UTC m=+929.279501184" lastFinishedPulling="2026-02-18 12:07:17.313720794 +0000 UTC m=+944.742621655" observedRunningTime="2026-02-18 12:07:18.740020397 +0000 UTC m=+946.168921268" watchObservedRunningTime="2026-02-18 12:07:18.7414098 +0000 UTC m=+946.170310661"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.771554 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb" podStartSLOduration=3.285099733 podStartE2EDuration="18.771534985s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.896366424 +0000 UTC m=+929.325267285" lastFinishedPulling="2026-02-18 12:07:17.382801676 +0000 UTC m=+944.811702537" observedRunningTime="2026-02-18 12:07:18.769911536 +0000 UTC m=+946.198812397" watchObservedRunningTime="2026-02-18 12:07:18.771534985 +0000 UTC m=+946.200435846"
Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.803545 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l" podStartSLOduration=3.368707825 podStartE2EDuration="18.803521415s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.939270806 +0000 UTC m=+929.368171667" lastFinishedPulling="2026-02-18 12:07:17.374084396 +0000 UTC m=+944.802985257" observedRunningTime="2026-02-18 12:07:18.797889559 +0000 UTC m=+946.226790430" watchObservedRunningTime="2026-02-18 12:07:18.803521415 +0000 UTC m=+946.232422276" Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.838747 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr" podStartSLOduration=3.679712785 podStartE2EDuration="18.838717031s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:02.153016578 +0000 UTC m=+929.581917439" lastFinishedPulling="2026-02-18 12:07:17.312020824 +0000 UTC m=+944.740921685" observedRunningTime="2026-02-18 12:07:18.836224371 +0000 UTC m=+946.265125232" watchObservedRunningTime="2026-02-18 12:07:18.838717031 +0000 UTC m=+946.267617892" Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.861683 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb" podStartSLOduration=3.823565581 podStartE2EDuration="19.861652433s" podCreationTimestamp="2026-02-18 12:06:59 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.308727608 +0000 UTC m=+928.737628469" lastFinishedPulling="2026-02-18 12:07:17.34681446 +0000 UTC m=+944.775715321" observedRunningTime="2026-02-18 12:07:18.852916023 +0000 UTC m=+946.281816884" watchObservedRunningTime="2026-02-18 12:07:18.861652433 +0000 UTC m=+946.290553294" Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.871551 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg" podStartSLOduration=3.681070472 podStartE2EDuration="19.8715263s" podCreationTimestamp="2026-02-18 12:06:59 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.162813907 +0000 UTC m=+928.591714768" lastFinishedPulling="2026-02-18 12:07:17.353269725 +0000 UTC m=+944.782170596" observedRunningTime="2026-02-18 12:07:18.869694287 +0000 UTC m=+946.298595158" watchObservedRunningTime="2026-02-18 12:07:18.8715263 +0000 UTC m=+946.300427161" Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.931226 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-59vtb" podStartSLOduration=3.911051162 podStartE2EDuration="18.931206046s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:02.33640002 +0000 UTC m=+929.765300881" lastFinishedPulling="2026-02-18 12:07:17.356554894 +0000 UTC m=+944.785455765" observedRunningTime="2026-02-18 12:07:18.896158683 +0000 UTC m=+946.325059554" watchObservedRunningTime="2026-02-18 12:07:18.931206046 +0000 UTC m=+946.360106907" Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.943713 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h" podStartSLOduration=3.406704828 podStartE2EDuration="18.943583364s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.850034529 +0000 UTC m=+929.278935390" lastFinishedPulling="2026-02-18 12:07:17.386913065 +0000 UTC m=+944.815813926" observedRunningTime="2026-02-18 12:07:18.920018897 +0000 UTC m=+946.348919758" watchObservedRunningTime="2026-02-18 12:07:18.943583364 +0000 UTC m=+946.372484235" Feb 18 12:07:18 crc kubenswrapper[4880]: I0218 12:07:18.957246 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq" podStartSLOduration=3.489987753 podStartE2EDuration="18.957221693s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.860838719 +0000 UTC m=+929.289739580" lastFinishedPulling="2026-02-18 12:07:17.328072649 +0000 UTC m=+944.756973520" observedRunningTime="2026-02-18 12:07:18.95009878 +0000 UTC m=+946.378999641" watchObservedRunningTime="2026-02-18 12:07:18.957221693 +0000 UTC m=+946.386122544" Feb 18 12:07:23 crc kubenswrapper[4880]: I0218 12:07:23.274601 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:07:23 crc kubenswrapper[4880]: I0218 12:07:23.275044 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:07:30 crc kubenswrapper[4880]: E0218 12:07:30.283732 4880 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34" Feb 18 12:07:30 crc kubenswrapper[4880]: E0218 12:07:30.285186 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mx8gr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-bgw24_openstack-operators(0cca4cb1-dbd6-45f5-b888-34735df9ca51): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 12:07:30 crc kubenswrapper[4880]: E0218 12:07:30.286662 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" podUID="0cca4cb1-dbd6-45f5-b888-34735df9ca51" Feb 18 12:07:30 crc kubenswrapper[4880]: I0218 12:07:30.338965 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bmrlg" Feb 18 12:07:30 crc kubenswrapper[4880]: I0218 12:07:30.388971 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vjrxb" Feb 18 12:07:30 crc kubenswrapper[4880]: I0218 12:07:30.459581 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-69f49c598c-4pvlt" Feb 18 12:07:30 crc kubenswrapper[4880]: I0218 12:07:30.542932 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vkgzq" Feb 18 12:07:30 crc kubenswrapper[4880]: I0218 12:07:30.617918 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-kwz6h" Feb 18 12:07:30 crc kubenswrapper[4880]: I0218 12:07:30.674525 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6wl45" Feb 18 12:07:30 crc kubenswrapper[4880]: I0218 12:07:30.794306 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-kv9jb" Feb 18 12:07:30 crc kubenswrapper[4880]: I0218 12:07:30.897263 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ghxp2" Feb 18 12:07:30 crc kubenswrapper[4880]: I0218 12:07:30.905114 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-2r47l" Feb 18 12:07:31 crc kubenswrapper[4880]: E0218 12:07:31.051427 4880 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99" Feb 18 12:07:31 crc kubenswrapper[4880]: E0218 12:07:31.051694 4880 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c59h5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-q6zml_openstack-operators(be820469-d6fd-48f8-bf42-e03cd3f95634): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 12:07:31 crc kubenswrapper[4880]: E0218 12:07:31.052884 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" podUID="be820469-d6fd-48f8-bf42-e03cd3f95634" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.130516 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ldqtr" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.324573 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6788bf7557-9kzzr" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.554464 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-7866795846-59vtb" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.830216 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" event={"ID":"cb351566-0fe8-4583-86c9-cd123efb0db6","Type":"ContainerStarted","Data":"8cd9e0853df5d614d052a5ca1dc524753fb0f1476a8af3cb23092416db77fb3e"} Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.832028 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" event={"ID":"ec4d0dea-02ec-481b-969c-0f8a0567c836","Type":"ContainerStarted","Data":"8310ef7c41e4e635b945db4d0a7dd50eb71a0666d9275f143023183b6e99b072"} Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.833890 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" event={"ID":"87dceaba-8b74-4e4a-847e-cfcb186347d5","Type":"ContainerStarted","Data":"647feeb1b112295a8db29186fe37e17ecf3bd000fd9e92ee539129af3d6cb6a6"} Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.834130 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.835149 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" event={"ID":"9f177e4f-9739-4a2b-924c-c07a176b7e06","Type":"ContainerStarted","Data":"1b37a788b6ecefefa087a68694becb4e0c74ecf659dc73e00227ef24262eee92"} Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.835310 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.836672 4880 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" event={"ID":"784b9cd8-45d9-4167-b857-edc06f4ce473","Type":"ContainerStarted","Data":"91bd735f68054c70baf51a9d506edd8ae59652473114100a9241db29cfd4f9c8"} Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.837028 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.838361 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" event={"ID":"7975dd71-7d05-400c-9058-c76c811d0bb0","Type":"ContainerStarted","Data":"cd8d3776ab330410dfc4bf491b5525cfc65ba7d471734923d9163872e0a6dc21"} Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.839017 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.852116 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" podStartSLOduration=3.567103217 podStartE2EDuration="32.852096986s" podCreationTimestamp="2026-02-18 12:06:59 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.851032383 +0000 UTC m=+929.279933244" lastFinishedPulling="2026-02-18 12:07:31.136026152 +0000 UTC m=+958.564927013" observedRunningTime="2026-02-18 12:07:31.850714548 +0000 UTC m=+959.279615409" watchObservedRunningTime="2026-02-18 12:07:31.852096986 +0000 UTC m=+959.280997847" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.865714 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" podStartSLOduration=2.666461684 podStartE2EDuration="31.865695929s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" 
firstStartedPulling="2026-02-18 12:07:01.939298777 +0000 UTC m=+929.368199638" lastFinishedPulling="2026-02-18 12:07:31.138533022 +0000 UTC m=+958.567433883" observedRunningTime="2026-02-18 12:07:31.863506139 +0000 UTC m=+959.292407000" watchObservedRunningTime="2026-02-18 12:07:31.865695929 +0000 UTC m=+959.294596790" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.880366 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" podStartSLOduration=2.9023521580000002 podStartE2EDuration="31.880351611s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:02.156099642 +0000 UTC m=+929.585000503" lastFinishedPulling="2026-02-18 12:07:31.134099095 +0000 UTC m=+958.562999956" observedRunningTime="2026-02-18 12:07:31.878393178 +0000 UTC m=+959.307294039" watchObservedRunningTime="2026-02-18 12:07:31.880351611 +0000 UTC m=+959.309252472" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.899371 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-q7fsv" podStartSLOduration=2.955853441 podStartE2EDuration="31.899354214s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:02.157851534 +0000 UTC m=+929.586752415" lastFinishedPulling="2026-02-18 12:07:31.101352327 +0000 UTC m=+958.530253188" observedRunningTime="2026-02-18 12:07:31.897232126 +0000 UTC m=+959.326132997" watchObservedRunningTime="2026-02-18 12:07:31.899354214 +0000 UTC m=+959.328255075" Feb 18 12:07:31 crc kubenswrapper[4880]: I0218 12:07:31.915063 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" podStartSLOduration=2.65141188 podStartE2EDuration="31.915045154s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" 
firstStartedPulling="2026-02-18 12:07:01.871626639 +0000 UTC m=+929.300527500" lastFinishedPulling="2026-02-18 12:07:31.135259913 +0000 UTC m=+958.564160774" observedRunningTime="2026-02-18 12:07:31.911456316 +0000 UTC m=+959.340357177" watchObservedRunningTime="2026-02-18 12:07:31.915045154 +0000 UTC m=+959.343946025" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.221572 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.230186 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaad9a94-37d2-4df9-ad82-1728dde9a0c4-cert\") pod \"infra-operator-controller-manager-79d975b745-k2psm\" (UID: \"eaad9a94-37d2-4df9-ad82-1728dde9a0c4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.381485 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.525755 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.535640 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9151c49-6a98-487d-a1a5-4c01a1d43bbf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw\" (UID: \"b9151c49-6a98-487d-a1a5-4c01a1d43bbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.554149 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.618955 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" podStartSLOduration=3.504183287 podStartE2EDuration="32.61893378s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.940955796 +0000 UTC m=+929.369856657" lastFinishedPulling="2026-02-18 12:07:31.055706289 +0000 UTC m=+958.484607150" observedRunningTime="2026-02-18 12:07:31.935671501 +0000 UTC m=+959.364572362" watchObservedRunningTime="2026-02-18 12:07:32.61893378 +0000 UTC m=+960.047834641" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.623384 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-k2psm"] Feb 18 12:07:32 crc kubenswrapper[4880]: W0218 12:07:32.634244 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaad9a94_37d2_4df9_ad82_1728dde9a0c4.slice/crio-46fa80509d564ee72fd9d53473212c8a037c61cd262672770cb35e96583f3675 WatchSource:0}: Error finding container 46fa80509d564ee72fd9d53473212c8a037c61cd262672770cb35e96583f3675: Status 404 returned error can't find the container with id 46fa80509d564ee72fd9d53473212c8a037c61cd262672770cb35e96583f3675 Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.846875 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" event={"ID":"eaad9a94-37d2-4df9-ad82-1728dde9a0c4","Type":"ContainerStarted","Data":"46fa80509d564ee72fd9d53473212c8a037c61cd262672770cb35e96583f3675"} Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.932432 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.932497 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.936907 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-metrics-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:32 crc kubenswrapper[4880]: I0218 12:07:32.937025 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dae0ccf1-c72f-4a57-960a-af1ed14922d7-webhook-certs\") pod \"openstack-operator-controller-manager-78cf7c9dc-t68gx\" (UID: \"dae0ccf1-c72f-4a57-960a-af1ed14922d7\") " pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:33 crc kubenswrapper[4880]: I0218 12:07:33.005411 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw"] Feb 18 12:07:33 crc kubenswrapper[4880]: W0218 12:07:33.016656 4880 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9151c49_6a98_487d_a1a5_4c01a1d43bbf.slice/crio-4a9a891140f499eba78a90192c5f5bb7e75b4659d5de32955f2259e6d3951581 WatchSource:0}: Error finding container 4a9a891140f499eba78a90192c5f5bb7e75b4659d5de32955f2259e6d3951581: Status 404 returned error can't find the container with id 4a9a891140f499eba78a90192c5f5bb7e75b4659d5de32955f2259e6d3951581 Feb 18 12:07:33 crc kubenswrapper[4880]: I0218 12:07:33.169001 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-p7v5j" Feb 18 12:07:33 crc kubenswrapper[4880]: I0218 12:07:33.177052 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:33 crc kubenswrapper[4880]: I0218 12:07:33.634186 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx"] Feb 18 12:07:33 crc kubenswrapper[4880]: I0218 12:07:33.859501 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" event={"ID":"dae0ccf1-c72f-4a57-960a-af1ed14922d7","Type":"ContainerStarted","Data":"b56fbfedfe1984c030335080a85db6166f26f822ff37aa29e9b4146e4c9b3a13"} Feb 18 12:07:33 crc kubenswrapper[4880]: I0218 12:07:33.859555 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" event={"ID":"dae0ccf1-c72f-4a57-960a-af1ed14922d7","Type":"ContainerStarted","Data":"b5bf0c44c967c95a5c7ee85f89b00f15a3c6b1df93069c0d51260d1aacda393f"} Feb 18 12:07:33 crc kubenswrapper[4880]: I0218 12:07:33.859657 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:33 crc kubenswrapper[4880]: I0218 
12:07:33.861773 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" event={"ID":"b9151c49-6a98-487d-a1a5-4c01a1d43bbf","Type":"ContainerStarted","Data":"4a9a891140f499eba78a90192c5f5bb7e75b4659d5de32955f2259e6d3951581"} Feb 18 12:07:33 crc kubenswrapper[4880]: I0218 12:07:33.897795 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" podStartSLOduration=33.89777449 podStartE2EDuration="33.89777449s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 12:07:33.893598865 +0000 UTC m=+961.322499736" watchObservedRunningTime="2026-02-18 12:07:33.89777449 +0000 UTC m=+961.326675351" Feb 18 12:07:36 crc kubenswrapper[4880]: I0218 12:07:36.889656 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" event={"ID":"eaad9a94-37d2-4df9-ad82-1728dde9a0c4","Type":"ContainerStarted","Data":"b2074df6dbd9774fdad1b77f1146a26be9b99cce21d1bb9fc5bf28d1088c4efb"} Feb 18 12:07:36 crc kubenswrapper[4880]: I0218 12:07:36.890449 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:36 crc kubenswrapper[4880]: I0218 12:07:36.891911 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" event={"ID":"b9151c49-6a98-487d-a1a5-4c01a1d43bbf","Type":"ContainerStarted","Data":"fc466f9d2a4c1c3dfaf8c8f81cdda461fe943f413e693f60e89631113afe2f04"} Feb 18 12:07:36 crc kubenswrapper[4880]: I0218 12:07:36.892098 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:36 crc kubenswrapper[4880]: I0218 12:07:36.911938 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" podStartSLOduration=33.696552655 podStartE2EDuration="36.911909244s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:32.637382996 +0000 UTC m=+960.066283857" lastFinishedPulling="2026-02-18 12:07:35.852739585 +0000 UTC m=+963.281640446" observedRunningTime="2026-02-18 12:07:36.90995347 +0000 UTC m=+964.338854361" watchObservedRunningTime="2026-02-18 12:07:36.911909244 +0000 UTC m=+964.340810125" Feb 18 12:07:36 crc kubenswrapper[4880]: I0218 12:07:36.946103 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" podStartSLOduration=34.089038671 podStartE2EDuration="36.946080111s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:33.020073913 +0000 UTC m=+960.448974774" lastFinishedPulling="2026-02-18 12:07:35.877115353 +0000 UTC m=+963.306016214" observedRunningTime="2026-02-18 12:07:36.94054117 +0000 UTC m=+964.369442051" watchObservedRunningTime="2026-02-18 12:07:36.946080111 +0000 UTC m=+964.374980972" Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.404410 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f7qmg"] Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.405997 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.419954 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7qmg"] Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.608326 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c95pb\" (UniqueName: \"kubernetes.io/projected/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-kube-api-access-c95pb\") pod \"certified-operators-f7qmg\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.608375 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-utilities\") pod \"certified-operators-f7qmg\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.608425 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-catalog-content\") pod \"certified-operators-f7qmg\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.709874 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c95pb\" (UniqueName: \"kubernetes.io/projected/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-kube-api-access-c95pb\") pod \"certified-operators-f7qmg\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.709976 4880 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-utilities\") pod \"certified-operators-f7qmg\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.710008 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-catalog-content\") pod \"certified-operators-f7qmg\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.710496 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-utilities\") pod \"certified-operators-f7qmg\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.710575 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-catalog-content\") pod \"certified-operators-f7qmg\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:37 crc kubenswrapper[4880]: I0218 12:07:37.733731 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c95pb\" (UniqueName: \"kubernetes.io/projected/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-kube-api-access-c95pb\") pod \"certified-operators-f7qmg\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:38 crc kubenswrapper[4880]: I0218 12:07:38.022649 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:38 crc kubenswrapper[4880]: I0218 12:07:38.467690 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7qmg"] Feb 18 12:07:38 crc kubenswrapper[4880]: W0218 12:07:38.473218 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9d3c2c3_f753_4b7f_8267_4a47d2e2bfe8.slice/crio-ba5c207b9fc3c103be8277675a631c0b5ae32508c953f4da3f52de221e91bc01 WatchSource:0}: Error finding container ba5c207b9fc3c103be8277675a631c0b5ae32508c953f4da3f52de221e91bc01: Status 404 returned error can't find the container with id ba5c207b9fc3c103be8277675a631c0b5ae32508c953f4da3f52de221e91bc01 Feb 18 12:07:38 crc kubenswrapper[4880]: I0218 12:07:38.906992 4880 generic.go:334] "Generic (PLEG): container finished" podID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerID="8cabe0247109d50f92333bcc60aa9c86e3a44fcb4706dfbc579be69824d239d6" exitCode=0 Feb 18 12:07:38 crc kubenswrapper[4880]: I0218 12:07:38.907055 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7qmg" event={"ID":"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8","Type":"ContainerDied","Data":"8cabe0247109d50f92333bcc60aa9c86e3a44fcb4706dfbc579be69824d239d6"} Feb 18 12:07:38 crc kubenswrapper[4880]: I0218 12:07:38.907121 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7qmg" event={"ID":"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8","Type":"ContainerStarted","Data":"ba5c207b9fc3c103be8277675a631c0b5ae32508c953f4da3f52de221e91bc01"} Feb 18 12:07:40 crc kubenswrapper[4880]: I0218 12:07:40.608919 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" Feb 18 12:07:40 crc kubenswrapper[4880]: I0218 12:07:40.611339 4880 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-t4mk9" Feb 18 12:07:40 crc kubenswrapper[4880]: I0218 12:07:40.735542 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-mcs6b" Feb 18 12:07:40 crc kubenswrapper[4880]: I0218 12:07:40.912519 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9bx28" Feb 18 12:07:40 crc kubenswrapper[4880]: I0218 12:07:40.924663 4880 generic.go:334] "Generic (PLEG): container finished" podID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerID="5522c2095e3cc612f611ab1f83dd7dcf1b9e6344f2d48392bc9f6014cffb9a32" exitCode=0 Feb 18 12:07:40 crc kubenswrapper[4880]: I0218 12:07:40.924755 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7qmg" event={"ID":"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8","Type":"ContainerDied","Data":"5522c2095e3cc612f611ab1f83dd7dcf1b9e6344f2d48392bc9f6014cffb9a32"} Feb 18 12:07:41 crc kubenswrapper[4880]: I0218 12:07:41.086012 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-bb2nv" Feb 18 12:07:41 crc kubenswrapper[4880]: I0218 12:07:41.106471 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7vr5w" Feb 18 12:07:41 crc kubenswrapper[4880]: I0218 12:07:41.933879 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7qmg" event={"ID":"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8","Type":"ContainerStarted","Data":"667198f82c04ed25da2255071f97a79fe65918a6be1be44c97757f599e5b25bd"} Feb 18 12:07:41 crc kubenswrapper[4880]: I0218 12:07:41.957440 4880 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/certified-operators-f7qmg" podStartSLOduration=2.173828896 podStartE2EDuration="4.957415971s" podCreationTimestamp="2026-02-18 12:07:37 +0000 UTC" firstStartedPulling="2026-02-18 12:07:38.908238173 +0000 UTC m=+966.337139024" lastFinishedPulling="2026-02-18 12:07:41.691825238 +0000 UTC m=+969.120726099" observedRunningTime="2026-02-18 12:07:41.951295522 +0000 UTC m=+969.380196393" watchObservedRunningTime="2026-02-18 12:07:41.957415971 +0000 UTC m=+969.386316832" Feb 18 12:07:42 crc kubenswrapper[4880]: E0218 12:07:42.182094 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" podUID="0cca4cb1-dbd6-45f5-b888-34735df9ca51" Feb 18 12:07:42 crc kubenswrapper[4880]: E0218 12:07:42.182124 4880 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" podUID="be820469-d6fd-48f8-bf42-e03cd3f95634" Feb 18 12:07:42 crc kubenswrapper[4880]: I0218 12:07:42.387789 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-k2psm" Feb 18 12:07:42 crc kubenswrapper[4880]: I0218 12:07:42.560439 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw" Feb 18 12:07:43 crc kubenswrapper[4880]: I0218 12:07:43.190127 4880 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-78cf7c9dc-t68gx" Feb 18 12:07:48 crc kubenswrapper[4880]: I0218 12:07:48.023703 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:48 crc kubenswrapper[4880]: I0218 12:07:48.024160 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:48 crc kubenswrapper[4880]: I0218 12:07:48.079390 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:49 crc kubenswrapper[4880]: I0218 12:07:49.018683 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:49 crc kubenswrapper[4880]: I0218 12:07:49.083279 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7qmg"] Feb 18 12:07:50 crc kubenswrapper[4880]: I0218 12:07:50.726560 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kpj5b"] Feb 18 12:07:50 crc kubenswrapper[4880]: I0218 12:07:50.729161 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:50 crc kubenswrapper[4880]: I0218 12:07:50.736050 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kpj5b"] Feb 18 12:07:50 crc kubenswrapper[4880]: I0218 12:07:50.898216 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-catalog-content\") pod \"community-operators-kpj5b\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:50 crc kubenswrapper[4880]: I0218 12:07:50.898429 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g57k\" (UniqueName: \"kubernetes.io/projected/58b05405-b705-4fe2-9e25-53a678f59145-kube-api-access-8g57k\") pod \"community-operators-kpj5b\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:50 crc kubenswrapper[4880]: I0218 12:07:50.898642 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-utilities\") pod \"community-operators-kpj5b\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:50 crc kubenswrapper[4880]: I0218 12:07:50.992554 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f7qmg" podUID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerName="registry-server" containerID="cri-o://667198f82c04ed25da2255071f97a79fe65918a6be1be44c97757f599e5b25bd" gracePeriod=2 Feb 18 12:07:51 crc kubenswrapper[4880]: I0218 12:07:51.000546 4880 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8g57k\" (UniqueName: \"kubernetes.io/projected/58b05405-b705-4fe2-9e25-53a678f59145-kube-api-access-8g57k\") pod \"community-operators-kpj5b\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:51 crc kubenswrapper[4880]: I0218 12:07:51.000681 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-utilities\") pod \"community-operators-kpj5b\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:51 crc kubenswrapper[4880]: I0218 12:07:51.000721 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-catalog-content\") pod \"community-operators-kpj5b\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:51 crc kubenswrapper[4880]: I0218 12:07:51.001210 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-utilities\") pod \"community-operators-kpj5b\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:51 crc kubenswrapper[4880]: I0218 12:07:51.001327 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-catalog-content\") pod \"community-operators-kpj5b\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:51 crc kubenswrapper[4880]: I0218 12:07:51.024090 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g57k\" 
(UniqueName: \"kubernetes.io/projected/58b05405-b705-4fe2-9e25-53a678f59145-kube-api-access-8g57k\") pod \"community-operators-kpj5b\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:51 crc kubenswrapper[4880]: I0218 12:07:51.056959 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:07:51 crc kubenswrapper[4880]: I0218 12:07:51.602137 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kpj5b"] Feb 18 12:07:51 crc kubenswrapper[4880]: W0218 12:07:51.608850 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58b05405_b705_4fe2_9e25_53a678f59145.slice/crio-36bd0c3ea8cd4ce4d256143ef8caa7e1bdf02afb7ce37e0138e737d32d6aacd5 WatchSource:0}: Error finding container 36bd0c3ea8cd4ce4d256143ef8caa7e1bdf02afb7ce37e0138e737d32d6aacd5: Status 404 returned error can't find the container with id 36bd0c3ea8cd4ce4d256143ef8caa7e1bdf02afb7ce37e0138e737d32d6aacd5 Feb 18 12:07:52 crc kubenswrapper[4880]: I0218 12:07:52.004894 4880 generic.go:334] "Generic (PLEG): container finished" podID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerID="667198f82c04ed25da2255071f97a79fe65918a6be1be44c97757f599e5b25bd" exitCode=0 Feb 18 12:07:52 crc kubenswrapper[4880]: I0218 12:07:52.004977 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7qmg" event={"ID":"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8","Type":"ContainerDied","Data":"667198f82c04ed25da2255071f97a79fe65918a6be1be44c97757f599e5b25bd"} Feb 18 12:07:52 crc kubenswrapper[4880]: I0218 12:07:52.009116 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpj5b" 
event={"ID":"58b05405-b705-4fe2-9e25-53a678f59145","Type":"ContainerStarted","Data":"d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31"} Feb 18 12:07:52 crc kubenswrapper[4880]: I0218 12:07:52.009269 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpj5b" event={"ID":"58b05405-b705-4fe2-9e25-53a678f59145","Type":"ContainerStarted","Data":"36bd0c3ea8cd4ce4d256143ef8caa7e1bdf02afb7ce37e0138e737d32d6aacd5"} Feb 18 12:07:53 crc kubenswrapper[4880]: I0218 12:07:53.018162 4880 generic.go:334] "Generic (PLEG): container finished" podID="58b05405-b705-4fe2-9e25-53a678f59145" containerID="d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31" exitCode=0 Feb 18 12:07:53 crc kubenswrapper[4880]: I0218 12:07:53.018218 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpj5b" event={"ID":"58b05405-b705-4fe2-9e25-53a678f59145","Type":"ContainerDied","Data":"d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31"} Feb 18 12:07:53 crc kubenswrapper[4880]: I0218 12:07:53.274085 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:07:53 crc kubenswrapper[4880]: I0218 12:07:53.274150 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:07:53 crc kubenswrapper[4880]: I0218 12:07:53.274198 4880 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" Feb 18 12:07:53 crc kubenswrapper[4880]: I0218 12:07:53.274834 4880 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d675a88a38ecd91c14a002c1ada584ee97f355276bf8e2925348d67f9f1846e3"} pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 12:07:53 crc kubenswrapper[4880]: I0218 12:07:53.274890 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" containerID="cri-o://d675a88a38ecd91c14a002c1ada584ee97f355276bf8e2925348d67f9f1846e3" gracePeriod=600 Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.000219 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.031497 4880 generic.go:334] "Generic (PLEG): container finished" podID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerID="d675a88a38ecd91c14a002c1ada584ee97f355276bf8e2925348d67f9f1846e3" exitCode=0 Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.031582 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerDied","Data":"d675a88a38ecd91c14a002c1ada584ee97f355276bf8e2925348d67f9f1846e3"} Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.031664 4880 scope.go:117] "RemoveContainer" containerID="ffb3f920a68020d12bee188c9effd93292465486a03ad6482f8c49de81c5a836" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.034127 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7qmg" event={"ID":"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8","Type":"ContainerDied","Data":"ba5c207b9fc3c103be8277675a631c0b5ae32508c953f4da3f52de221e91bc01"} Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.034203 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7qmg" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.062333 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-catalog-content\") pod \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.062416 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c95pb\" (UniqueName: \"kubernetes.io/projected/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-kube-api-access-c95pb\") pod \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.062563 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-utilities\") pod \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\" (UID: \"a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8\") " Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.063720 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-utilities" (OuterVolumeSpecName: "utilities") pod "a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" (UID: "a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.068827 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-kube-api-access-c95pb" (OuterVolumeSpecName: "kube-api-access-c95pb") pod "a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" (UID: "a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8"). InnerVolumeSpecName "kube-api-access-c95pb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.118239 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" (UID: "a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.164293 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.164329 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.164340 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c95pb\" (UniqueName: \"kubernetes.io/projected/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8-kube-api-access-c95pb\") on node \"crc\" DevicePath \"\"" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.213988 4880 scope.go:117] "RemoveContainer" containerID="667198f82c04ed25da2255071f97a79fe65918a6be1be44c97757f599e5b25bd" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.242912 4880 scope.go:117] "RemoveContainer" containerID="5522c2095e3cc612f611ab1f83dd7dcf1b9e6344f2d48392bc9f6014cffb9a32" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.294316 4880 scope.go:117] "RemoveContainer" containerID="8cabe0247109d50f92333bcc60aa9c86e3a44fcb4706dfbc579be69824d239d6" Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.381511 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-f7qmg"] Feb 18 12:07:54 crc kubenswrapper[4880]: I0218 12:07:54.394916 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f7qmg"] Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.048050 4880 generic.go:334] "Generic (PLEG): container finished" podID="58b05405-b705-4fe2-9e25-53a678f59145" containerID="6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540" exitCode=0 Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.048176 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpj5b" event={"ID":"58b05405-b705-4fe2-9e25-53a678f59145","Type":"ContainerDied","Data":"6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540"} Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.052574 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerStarted","Data":"db56eae41c466e7b8fe82658ddc5618a34717bb6d7aa0b464d124a307e8a7668"} Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.189136 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" path="/var/lib/kubelet/pods/a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8/volumes" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.724058 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rj97k"] Feb 18 12:07:55 crc kubenswrapper[4880]: E0218 12:07:55.724796 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerName="registry-server" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.724812 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerName="registry-server" Feb 18 12:07:55 crc 
kubenswrapper[4880]: E0218 12:07:55.724842 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerName="extract-utilities" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.724857 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerName="extract-utilities" Feb 18 12:07:55 crc kubenswrapper[4880]: E0218 12:07:55.724865 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerName="extract-content" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.724871 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerName="extract-content" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.724994 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d3c2c3-f753-4b7f-8267-4a47d2e2bfe8" containerName="registry-server" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.726009 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.745737 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj97k"] Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.889651 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhdmx\" (UniqueName: \"kubernetes.io/projected/23e1082c-6106-415c-ae61-fcfb3dcde076-kube-api-access-nhdmx\") pod \"redhat-marketplace-rj97k\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.890306 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-catalog-content\") pod \"redhat-marketplace-rj97k\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.890501 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-utilities\") pod \"redhat-marketplace-rj97k\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.991600 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-utilities\") pod \"redhat-marketplace-rj97k\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.991772 4880 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nhdmx\" (UniqueName: \"kubernetes.io/projected/23e1082c-6106-415c-ae61-fcfb3dcde076-kube-api-access-nhdmx\") pod \"redhat-marketplace-rj97k\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.991803 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-catalog-content\") pod \"redhat-marketplace-rj97k\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.992498 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-utilities\") pod \"redhat-marketplace-rj97k\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:55 crc kubenswrapper[4880]: I0218 12:07:55.992547 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-catalog-content\") pod \"redhat-marketplace-rj97k\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:56 crc kubenswrapper[4880]: I0218 12:07:56.015574 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhdmx\" (UniqueName: \"kubernetes.io/projected/23e1082c-6106-415c-ae61-fcfb3dcde076-kube-api-access-nhdmx\") pod \"redhat-marketplace-rj97k\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:56 crc kubenswrapper[4880]: I0218 12:07:56.042189 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:07:56 crc kubenswrapper[4880]: I0218 12:07:56.104094 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpj5b" event={"ID":"58b05405-b705-4fe2-9e25-53a678f59145","Type":"ContainerStarted","Data":"23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff"} Feb 18 12:07:56 crc kubenswrapper[4880]: I0218 12:07:56.108589 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" event={"ID":"be820469-d6fd-48f8-bf42-e03cd3f95634","Type":"ContainerStarted","Data":"241aff833a4621dd0dd02c865596c68cd7d774b8519c023e4b81be6342b4469f"} Feb 18 12:07:56 crc kubenswrapper[4880]: I0218 12:07:56.109483 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" Feb 18 12:07:56 crc kubenswrapper[4880]: I0218 12:07:56.128088 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kpj5b" podStartSLOduration=3.651912693 podStartE2EDuration="6.128071718s" podCreationTimestamp="2026-02-18 12:07:50 +0000 UTC" firstStartedPulling="2026-02-18 12:07:53.020201649 +0000 UTC m=+980.449102510" lastFinishedPulling="2026-02-18 12:07:55.496360674 +0000 UTC m=+982.925261535" observedRunningTime="2026-02-18 12:07:56.1245508 +0000 UTC m=+983.553451671" watchObservedRunningTime="2026-02-18 12:07:56.128071718 +0000 UTC m=+983.556972579" Feb 18 12:07:56 crc kubenswrapper[4880]: I0218 12:07:56.141852 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" podStartSLOduration=3.388634748 podStartE2EDuration="56.141829574s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:02.187544768 +0000 UTC 
m=+929.616445629" lastFinishedPulling="2026-02-18 12:07:54.940739594 +0000 UTC m=+982.369640455" observedRunningTime="2026-02-18 12:07:56.139083557 +0000 UTC m=+983.567984428" watchObservedRunningTime="2026-02-18 12:07:56.141829574 +0000 UTC m=+983.570730435" Feb 18 12:07:56 crc kubenswrapper[4880]: I0218 12:07:56.501146 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj97k"] Feb 18 12:07:57 crc kubenswrapper[4880]: I0218 12:07:57.118486 4880 generic.go:334] "Generic (PLEG): container finished" podID="23e1082c-6106-415c-ae61-fcfb3dcde076" containerID="6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b" exitCode=0 Feb 18 12:07:57 crc kubenswrapper[4880]: I0218 12:07:57.118568 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj97k" event={"ID":"23e1082c-6106-415c-ae61-fcfb3dcde076","Type":"ContainerDied","Data":"6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b"} Feb 18 12:07:57 crc kubenswrapper[4880]: I0218 12:07:57.119099 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj97k" event={"ID":"23e1082c-6106-415c-ae61-fcfb3dcde076","Type":"ContainerStarted","Data":"e984974c3c0eb87ab17a3ec60fab55bea36a0d62e08f86886c4c5d86a24318b2"} Feb 18 12:08:01 crc kubenswrapper[4880]: I0218 12:08:01.057181 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:08:01 crc kubenswrapper[4880]: I0218 12:08:01.058050 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:08:01 crc kubenswrapper[4880]: I0218 12:08:01.105195 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:08:01 crc kubenswrapper[4880]: I0218 12:08:01.146998 4880 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" event={"ID":"0cca4cb1-dbd6-45f5-b888-34735df9ca51","Type":"ContainerStarted","Data":"e01fb30d881130d7a32d64e611109b046bac9cd028635f74b3be686d4918d75c"} Feb 18 12:08:01 crc kubenswrapper[4880]: I0218 12:08:01.147392 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" Feb 18 12:08:01 crc kubenswrapper[4880]: I0218 12:08:01.149326 4880 generic.go:334] "Generic (PLEG): container finished" podID="23e1082c-6106-415c-ae61-fcfb3dcde076" containerID="55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1" exitCode=0 Feb 18 12:08:01 crc kubenswrapper[4880]: I0218 12:08:01.150762 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj97k" event={"ID":"23e1082c-6106-415c-ae61-fcfb3dcde076","Type":"ContainerDied","Data":"55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1"} Feb 18 12:08:01 crc kubenswrapper[4880]: I0218 12:08:01.170269 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-q6zml" Feb 18 12:08:01 crc kubenswrapper[4880]: I0218 12:08:01.184578 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" podStartSLOduration=2.589468676 podStartE2EDuration="1m1.18456147s" podCreationTimestamp="2026-02-18 12:07:00 +0000 UTC" firstStartedPulling="2026-02-18 12:07:01.947536565 +0000 UTC m=+929.376437416" lastFinishedPulling="2026-02-18 12:08:00.542629339 +0000 UTC m=+987.971530210" observedRunningTime="2026-02-18 12:08:01.16316367 +0000 UTC m=+988.592064541" watchObservedRunningTime="2026-02-18 12:08:01.18456147 +0000 UTC m=+988.613462331" Feb 18 12:08:01 crc kubenswrapper[4880]: I0218 12:08:01.202832 4880 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:08:02 crc kubenswrapper[4880]: I0218 12:08:02.158662 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj97k" event={"ID":"23e1082c-6106-415c-ae61-fcfb3dcde076","Type":"ContainerStarted","Data":"b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7"} Feb 18 12:08:02 crc kubenswrapper[4880]: I0218 12:08:02.181109 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rj97k" podStartSLOduration=2.661664014 podStartE2EDuration="7.181089135s" podCreationTimestamp="2026-02-18 12:07:55 +0000 UTC" firstStartedPulling="2026-02-18 12:07:57.121709562 +0000 UTC m=+984.550610433" lastFinishedPulling="2026-02-18 12:08:01.641134693 +0000 UTC m=+989.070035554" observedRunningTime="2026-02-18 12:08:02.175172448 +0000 UTC m=+989.604073309" watchObservedRunningTime="2026-02-18 12:08:02.181089135 +0000 UTC m=+989.609989996" Feb 18 12:08:02 crc kubenswrapper[4880]: I0218 12:08:02.912386 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kpj5b"] Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.166877 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kpj5b" podUID="58b05405-b705-4fe2-9e25-53a678f59145" containerName="registry-server" containerID="cri-o://23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff" gracePeriod=2 Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.572421 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.720720 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-catalog-content\") pod \"58b05405-b705-4fe2-9e25-53a678f59145\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.721291 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g57k\" (UniqueName: \"kubernetes.io/projected/58b05405-b705-4fe2-9e25-53a678f59145-kube-api-access-8g57k\") pod \"58b05405-b705-4fe2-9e25-53a678f59145\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.721386 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-utilities\") pod \"58b05405-b705-4fe2-9e25-53a678f59145\" (UID: \"58b05405-b705-4fe2-9e25-53a678f59145\") " Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.722711 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-utilities" (OuterVolumeSpecName: "utilities") pod "58b05405-b705-4fe2-9e25-53a678f59145" (UID: "58b05405-b705-4fe2-9e25-53a678f59145"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.734919 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58b05405-b705-4fe2-9e25-53a678f59145-kube-api-access-8g57k" (OuterVolumeSpecName: "kube-api-access-8g57k") pod "58b05405-b705-4fe2-9e25-53a678f59145" (UID: "58b05405-b705-4fe2-9e25-53a678f59145"). InnerVolumeSpecName "kube-api-access-8g57k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.776042 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58b05405-b705-4fe2-9e25-53a678f59145" (UID: "58b05405-b705-4fe2-9e25-53a678f59145"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.823701 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.823741 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g57k\" (UniqueName: \"kubernetes.io/projected/58b05405-b705-4fe2-9e25-53a678f59145-kube-api-access-8g57k\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:03 crc kubenswrapper[4880]: I0218 12:08:03.823753 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58b05405-b705-4fe2-9e25-53a678f59145-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.176266 4880 generic.go:334] "Generic (PLEG): container finished" podID="58b05405-b705-4fe2-9e25-53a678f59145" containerID="23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff" exitCode=0 Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.176338 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpj5b" event={"ID":"58b05405-b705-4fe2-9e25-53a678f59145","Type":"ContainerDied","Data":"23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff"} Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.176357 4880 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-kpj5b" Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.176394 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kpj5b" event={"ID":"58b05405-b705-4fe2-9e25-53a678f59145","Type":"ContainerDied","Data":"36bd0c3ea8cd4ce4d256143ef8caa7e1bdf02afb7ce37e0138e737d32d6aacd5"} Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.176424 4880 scope.go:117] "RemoveContainer" containerID="23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff" Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.194839 4880 scope.go:117] "RemoveContainer" containerID="6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540" Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.214679 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kpj5b"] Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.221921 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kpj5b"] Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.222791 4880 scope.go:117] "RemoveContainer" containerID="d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31" Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.248381 4880 scope.go:117] "RemoveContainer" containerID="23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff" Feb 18 12:08:04 crc kubenswrapper[4880]: E0218 12:08:04.249199 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff\": container with ID starting with 23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff not found: ID does not exist" containerID="23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff" Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.249237 
4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff"} err="failed to get container status \"23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff\": rpc error: code = NotFound desc = could not find container \"23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff\": container with ID starting with 23aac6da1013eb45366aedf9c3b10a58c2e9f528482bd1fc8390aa79f68ca2ff not found: ID does not exist" Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.249264 4880 scope.go:117] "RemoveContainer" containerID="6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540" Feb 18 12:08:04 crc kubenswrapper[4880]: E0218 12:08:04.249723 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540\": container with ID starting with 6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540 not found: ID does not exist" containerID="6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540" Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.249773 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540"} err="failed to get container status \"6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540\": rpc error: code = NotFound desc = could not find container \"6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540\": container with ID starting with 6d623a4c6d465b641a22c7123f0dc4d2b68b3a73d67fb251b3a6d46b53672540 not found: ID does not exist" Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.249805 4880 scope.go:117] "RemoveContainer" containerID="d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31" Feb 18 12:08:04 crc kubenswrapper[4880]: E0218 
12:08:04.250231 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31\": container with ID starting with d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31 not found: ID does not exist" containerID="d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31" Feb 18 12:08:04 crc kubenswrapper[4880]: I0218 12:08:04.250253 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31"} err="failed to get container status \"d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31\": rpc error: code = NotFound desc = could not find container \"d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31\": container with ID starting with d82ca5703d49078c4b2598e1213a3ef48aeabf727be3065728fa5a789220ed31 not found: ID does not exist" Feb 18 12:08:05 crc kubenswrapper[4880]: I0218 12:08:05.191228 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58b05405-b705-4fe2-9e25-53a678f59145" path="/var/lib/kubelet/pods/58b05405-b705-4fe2-9e25-53a678f59145/volumes" Feb 18 12:08:06 crc kubenswrapper[4880]: I0218 12:08:06.043024 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:08:06 crc kubenswrapper[4880]: I0218 12:08:06.043072 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:08:06 crc kubenswrapper[4880]: I0218 12:08:06.092541 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:08:06 crc kubenswrapper[4880]: I0218 12:08:06.241013 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:08:07 crc kubenswrapper[4880]: I0218 12:08:07.314824 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj97k"] Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.206515 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rj97k" podUID="23e1082c-6106-415c-ae61-fcfb3dcde076" containerName="registry-server" containerID="cri-o://b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7" gracePeriod=2 Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.648257 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.796429 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhdmx\" (UniqueName: \"kubernetes.io/projected/23e1082c-6106-415c-ae61-fcfb3dcde076-kube-api-access-nhdmx\") pod \"23e1082c-6106-415c-ae61-fcfb3dcde076\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.796495 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-catalog-content\") pod \"23e1082c-6106-415c-ae61-fcfb3dcde076\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.796518 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-utilities\") pod \"23e1082c-6106-415c-ae61-fcfb3dcde076\" (UID: \"23e1082c-6106-415c-ae61-fcfb3dcde076\") " Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.797648 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-utilities" (OuterVolumeSpecName: "utilities") pod "23e1082c-6106-415c-ae61-fcfb3dcde076" (UID: "23e1082c-6106-415c-ae61-fcfb3dcde076"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.803032 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e1082c-6106-415c-ae61-fcfb3dcde076-kube-api-access-nhdmx" (OuterVolumeSpecName: "kube-api-access-nhdmx") pod "23e1082c-6106-415c-ae61-fcfb3dcde076" (UID: "23e1082c-6106-415c-ae61-fcfb3dcde076"). InnerVolumeSpecName "kube-api-access-nhdmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.818757 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23e1082c-6106-415c-ae61-fcfb3dcde076" (UID: "23e1082c-6106-415c-ae61-fcfb3dcde076"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.898828 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhdmx\" (UniqueName: \"kubernetes.io/projected/23e1082c-6106-415c-ae61-fcfb3dcde076-kube-api-access-nhdmx\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.899274 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:08 crc kubenswrapper[4880]: I0218 12:08:08.899289 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e1082c-6106-415c-ae61-fcfb3dcde076-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.216774 4880 generic.go:334] "Generic (PLEG): container finished" podID="23e1082c-6106-415c-ae61-fcfb3dcde076" containerID="b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7" exitCode=0 Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.216822 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj97k" event={"ID":"23e1082c-6106-415c-ae61-fcfb3dcde076","Type":"ContainerDied","Data":"b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7"} Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.216853 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj97k" event={"ID":"23e1082c-6106-415c-ae61-fcfb3dcde076","Type":"ContainerDied","Data":"e984974c3c0eb87ab17a3ec60fab55bea36a0d62e08f86886c4c5d86a24318b2"} Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.216874 4880 scope.go:117] "RemoveContainer" containerID="b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7" Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 
12:08:09.217041 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj97k" Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.244538 4880 scope.go:117] "RemoveContainer" containerID="55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1" Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.245227 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj97k"] Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.259124 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj97k"] Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.263452 4880 scope.go:117] "RemoveContainer" containerID="6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b" Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.285280 4880 scope.go:117] "RemoveContainer" containerID="b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7" Feb 18 12:08:09 crc kubenswrapper[4880]: E0218 12:08:09.285796 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7\": container with ID starting with b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7 not found: ID does not exist" containerID="b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7" Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.285837 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7"} err="failed to get container status \"b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7\": rpc error: code = NotFound desc = could not find container \"b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7\": container with ID starting with 
b43be25e83a023a382b6b6ef5390780a7e1cff61943e4cfe85ba2bcb329095d7 not found: ID does not exist" Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.285862 4880 scope.go:117] "RemoveContainer" containerID="55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1" Feb 18 12:08:09 crc kubenswrapper[4880]: E0218 12:08:09.286240 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1\": container with ID starting with 55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1 not found: ID does not exist" containerID="55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1" Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.286275 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1"} err="failed to get container status \"55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1\": rpc error: code = NotFound desc = could not find container \"55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1\": container with ID starting with 55bf572e4b7f102d67dc7288d6fe8ae889ac7927b4503c3935b583fd7faacbf1 not found: ID does not exist" Feb 18 12:08:09 crc kubenswrapper[4880]: I0218 12:08:09.286291 4880 scope.go:117] "RemoveContainer" containerID="6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b" Feb 18 12:08:09 crc kubenswrapper[4880]: E0218 12:08:09.287180 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b\": container with ID starting with 6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b not found: ID does not exist" containerID="6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b" Feb 18 12:08:09 crc 
kubenswrapper[4880]: I0218 12:08:09.287239 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b"} err="failed to get container status \"6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b\": rpc error: code = NotFound desc = could not find container \"6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b\": container with ID starting with 6b9a798e303434dd69095c756a74fce3d1e249c5f6a676288d7dde2a3f2ac00b not found: ID does not exist" Feb 18 12:08:11 crc kubenswrapper[4880]: I0218 12:08:11.019456 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-bgw24" Feb 18 12:08:11 crc kubenswrapper[4880]: I0218 12:08:11.189338 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e1082c-6106-415c-ae61-fcfb3dcde076" path="/var/lib/kubelet/pods/23e1082c-6106-415c-ae61-fcfb3dcde076/volumes" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.295336 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5mwfq/must-gather-92j92"] Feb 18 12:08:43 crc kubenswrapper[4880]: E0218 12:08:43.296545 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b05405-b705-4fe2-9e25-53a678f59145" containerName="registry-server" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.296563 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b05405-b705-4fe2-9e25-53a678f59145" containerName="registry-server" Feb 18 12:08:43 crc kubenswrapper[4880]: E0218 12:08:43.296573 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e1082c-6106-415c-ae61-fcfb3dcde076" containerName="registry-server" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.296580 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e1082c-6106-415c-ae61-fcfb3dcde076" 
containerName="registry-server" Feb 18 12:08:43 crc kubenswrapper[4880]: E0218 12:08:43.296598 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e1082c-6106-415c-ae61-fcfb3dcde076" containerName="extract-utilities" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.296635 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e1082c-6106-415c-ae61-fcfb3dcde076" containerName="extract-utilities" Feb 18 12:08:43 crc kubenswrapper[4880]: E0218 12:08:43.296663 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b05405-b705-4fe2-9e25-53a678f59145" containerName="extract-content" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.296670 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b05405-b705-4fe2-9e25-53a678f59145" containerName="extract-content" Feb 18 12:08:43 crc kubenswrapper[4880]: E0218 12:08:43.296684 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e1082c-6106-415c-ae61-fcfb3dcde076" containerName="extract-content" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.296691 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e1082c-6106-415c-ae61-fcfb3dcde076" containerName="extract-content" Feb 18 12:08:43 crc kubenswrapper[4880]: E0218 12:08:43.296701 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b05405-b705-4fe2-9e25-53a678f59145" containerName="extract-utilities" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.296720 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b05405-b705-4fe2-9e25-53a678f59145" containerName="extract-utilities" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.296894 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="58b05405-b705-4fe2-9e25-53a678f59145" containerName="registry-server" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.296925 4880 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="23e1082c-6106-415c-ae61-fcfb3dcde076" containerName="registry-server" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.298182 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5mwfq/must-gather-92j92" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.300852 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5mwfq/must-gather-92j92"] Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.307581 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5mwfq"/"kube-root-ca.crt" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.308106 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5mwfq"/"openshift-service-ca.crt" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.381674 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjnbn\" (UniqueName: \"kubernetes.io/projected/a76872d3-673b-4433-bc92-39d7aa2af3f0-kube-api-access-fjnbn\") pod \"must-gather-92j92\" (UID: \"a76872d3-673b-4433-bc92-39d7aa2af3f0\") " pod="openshift-must-gather-5mwfq/must-gather-92j92" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.381731 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a76872d3-673b-4433-bc92-39d7aa2af3f0-must-gather-output\") pod \"must-gather-92j92\" (UID: \"a76872d3-673b-4433-bc92-39d7aa2af3f0\") " pod="openshift-must-gather-5mwfq/must-gather-92j92" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.483316 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjnbn\" (UniqueName: \"kubernetes.io/projected/a76872d3-673b-4433-bc92-39d7aa2af3f0-kube-api-access-fjnbn\") pod \"must-gather-92j92\" (UID: \"a76872d3-673b-4433-bc92-39d7aa2af3f0\") " 
pod="openshift-must-gather-5mwfq/must-gather-92j92" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.483393 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a76872d3-673b-4433-bc92-39d7aa2af3f0-must-gather-output\") pod \"must-gather-92j92\" (UID: \"a76872d3-673b-4433-bc92-39d7aa2af3f0\") " pod="openshift-must-gather-5mwfq/must-gather-92j92" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.483959 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a76872d3-673b-4433-bc92-39d7aa2af3f0-must-gather-output\") pod \"must-gather-92j92\" (UID: \"a76872d3-673b-4433-bc92-39d7aa2af3f0\") " pod="openshift-must-gather-5mwfq/must-gather-92j92" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.505865 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjnbn\" (UniqueName: \"kubernetes.io/projected/a76872d3-673b-4433-bc92-39d7aa2af3f0-kube-api-access-fjnbn\") pod \"must-gather-92j92\" (UID: \"a76872d3-673b-4433-bc92-39d7aa2af3f0\") " pod="openshift-must-gather-5mwfq/must-gather-92j92" Feb 18 12:08:43 crc kubenswrapper[4880]: I0218 12:08:43.650701 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5mwfq/must-gather-92j92" Feb 18 12:08:44 crc kubenswrapper[4880]: I0218 12:08:44.147415 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5mwfq/must-gather-92j92"] Feb 18 12:08:44 crc kubenswrapper[4880]: W0218 12:08:44.152064 4880 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda76872d3_673b_4433_bc92_39d7aa2af3f0.slice/crio-b2b11de1d14e9b102fb921ba85f7f191caddeccd07ec7e2710396e7a4b0a2bbf WatchSource:0}: Error finding container b2b11de1d14e9b102fb921ba85f7f191caddeccd07ec7e2710396e7a4b0a2bbf: Status 404 returned error can't find the container with id b2b11de1d14e9b102fb921ba85f7f191caddeccd07ec7e2710396e7a4b0a2bbf Feb 18 12:08:44 crc kubenswrapper[4880]: I0218 12:08:44.541179 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mwfq/must-gather-92j92" event={"ID":"a76872d3-673b-4433-bc92-39d7aa2af3f0","Type":"ContainerStarted","Data":"b2b11de1d14e9b102fb921ba85f7f191caddeccd07ec7e2710396e7a4b0a2bbf"} Feb 18 12:08:50 crc kubenswrapper[4880]: I0218 12:08:50.585355 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mwfq/must-gather-92j92" event={"ID":"a76872d3-673b-4433-bc92-39d7aa2af3f0","Type":"ContainerStarted","Data":"acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c"} Feb 18 12:08:50 crc kubenswrapper[4880]: I0218 12:08:50.586111 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mwfq/must-gather-92j92" event={"ID":"a76872d3-673b-4433-bc92-39d7aa2af3f0","Type":"ContainerStarted","Data":"ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be"} Feb 18 12:08:50 crc kubenswrapper[4880]: I0218 12:08:50.605705 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5mwfq/must-gather-92j92" podStartSLOduration=1.7584050329999998 
podStartE2EDuration="7.605688259s" podCreationTimestamp="2026-02-18 12:08:43 +0000 UTC" firstStartedPulling="2026-02-18 12:08:44.15526863 +0000 UTC m=+1031.584169491" lastFinishedPulling="2026-02-18 12:08:50.002551856 +0000 UTC m=+1037.431452717" observedRunningTime="2026-02-18 12:08:50.603185689 +0000 UTC m=+1038.032086570" watchObservedRunningTime="2026-02-18 12:08:50.605688259 +0000 UTC m=+1038.034589120" Feb 18 12:09:50 crc kubenswrapper[4880]: I0218 12:09:50.192574 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp_390a7018-a85a-45ce-85e5-215b6b67a5f3/util/0.log" Feb 18 12:09:50 crc kubenswrapper[4880]: I0218 12:09:50.545902 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp_390a7018-a85a-45ce-85e5-215b6b67a5f3/util/0.log" Feb 18 12:09:50 crc kubenswrapper[4880]: I0218 12:09:50.554553 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp_390a7018-a85a-45ce-85e5-215b6b67a5f3/pull/0.log" Feb 18 12:09:50 crc kubenswrapper[4880]: I0218 12:09:50.617104 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp_390a7018-a85a-45ce-85e5-215b6b67a5f3/pull/0.log" Feb 18 12:09:50 crc kubenswrapper[4880]: I0218 12:09:50.716184 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp_390a7018-a85a-45ce-85e5-215b6b67a5f3/extract/0.log" Feb 18 12:09:50 crc kubenswrapper[4880]: I0218 12:09:50.741537 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp_390a7018-a85a-45ce-85e5-215b6b67a5f3/pull/0.log" Feb 18 12:09:50 crc 
kubenswrapper[4880]: I0218 12:09:50.742698 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f8f3eba7f1491a30a2688ad7213d290d39c4514d03955d1b0eb7ec9f8bwgsp_390a7018-a85a-45ce-85e5-215b6b67a5f3/util/0.log" Feb 18 12:09:51 crc kubenswrapper[4880]: I0218 12:09:51.175731 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-vjrxb_fd00b294-1b81-4988-bf25-d5681534e3f1/manager/0.log" Feb 18 12:09:51 crc kubenswrapper[4880]: I0218 12:09:51.418469 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-mcs6b_7975dd71-7d05-400c-9058-c76c811d0bb0/manager/0.log" Feb 18 12:09:51 crc kubenswrapper[4880]: I0218 12:09:51.603381 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-4pvlt_479cc862-303e-4c55-9818-4c4cd04a5987/manager/0.log" Feb 18 12:09:51 crc kubenswrapper[4880]: I0218 12:09:51.687862 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-bmrlg_1d6b527e-fad8-4853-a0ab-d9fa69c934ff/manager/0.log" Feb 18 12:09:51 crc kubenswrapper[4880]: I0218 12:09:51.821180 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-vkgzq_df1f8ad9-7e58-4e54-b189-d994806814bf/manager/0.log" Feb 18 12:09:51 crc kubenswrapper[4880]: I0218 12:09:51.902054 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-k2psm_eaad9a94-37d2-4df9-ad82-1728dde9a0c4/manager/0.log" Feb 18 12:09:52 crc kubenswrapper[4880]: I0218 12:09:52.127708 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-kwz6h_bfb70827-0634-4721-8cac-3d1a0e4153a6/manager/0.log" 
Feb 18 12:09:52 crc kubenswrapper[4880]: I0218 12:09:52.277317 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-6wl45_5d4d2d05-77bc-4e37-8450-b2281aaa468a/manager/0.log" Feb 18 12:09:52 crc kubenswrapper[4880]: I0218 12:09:52.400131 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-kv9jb_bf26df20-6bff-4038-bfcb-c87b39c061ed/manager/0.log" Feb 18 12:09:52 crc kubenswrapper[4880]: I0218 12:09:52.599503 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-2r47l_58de4ed7-c54f-4970-99ab-dae7521db195/manager/0.log" Feb 18 12:09:52 crc kubenswrapper[4880]: I0218 12:09:52.829100 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-ghxp2_269f6148-d710-4a17-9180-0258afb04709/manager/0.log" Feb 18 12:09:52 crc kubenswrapper[4880]: I0218 12:09:52.907168 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-9bx28_784b9cd8-45d9-4167-b857-edc06f4ce473/manager/0.log" Feb 18 12:09:53 crc kubenswrapper[4880]: I0218 12:09:53.300296 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cpmrgw_b9151c49-6a98-487d-a1a5-4c01a1d43bbf/manager/0.log" Feb 18 12:09:53 crc kubenswrapper[4880]: I0218 12:09:53.776875 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-795b869f54-7wrfl_1b74510a-df13-4112-a258-4d1d08b5258e/operator/0.log" Feb 18 12:09:53 crc kubenswrapper[4880]: I0218 12:09:53.838282 4880 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-78cf7c9dc-t68gx_dae0ccf1-c72f-4a57-960a-af1ed14922d7/manager/0.log" Feb 18 12:09:53 crc kubenswrapper[4880]: I0218 12:09:53.841983 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-bgw24_0cca4cb1-dbd6-45f5-b888-34735df9ca51/manager/0.log" Feb 18 12:09:54 crc kubenswrapper[4880]: I0218 12:09:54.041504 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-x2nkq_1a2e5b16-76c1-4ade-916d-0b4fd45b3acf/registry-server/0.log" Feb 18 12:09:54 crc kubenswrapper[4880]: I0218 12:09:54.245268 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-t4mk9_cb351566-0fe8-4583-86c9-cd123efb0db6/manager/0.log" Feb 18 12:09:54 crc kubenswrapper[4880]: I0218 12:09:54.460689 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-bb2nv_87dceaba-8b74-4e4a-847e-cfcb186347d5/manager/0.log" Feb 18 12:09:54 crc kubenswrapper[4880]: I0218 12:09:54.466722 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-7vr5w_9f177e4f-9739-4a2b-924c-c07a176b7e06/manager/0.log" Feb 18 12:09:54 crc kubenswrapper[4880]: I0218 12:09:54.602322 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-q7fsv_ec4d0dea-02ec-481b-969c-0f8a0567c836/operator/0.log" Feb 18 12:09:54 crc kubenswrapper[4880]: I0218 12:09:54.649895 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-ldqtr_996aea3a-0693-43ee-b40a-f623c674cc12/manager/0.log" Feb 18 12:09:54 crc kubenswrapper[4880]: I0218 12:09:54.701491 4880 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-q6zml_be820469-d6fd-48f8-bf42-e03cd3f95634/manager/0.log" Feb 18 12:09:54 crc kubenswrapper[4880]: I0218 12:09:54.827264 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-59vtb_247f4692-a10b-46e0-8304-9effd006efa4/manager/0.log" Feb 18 12:09:54 crc kubenswrapper[4880]: I0218 12:09:54.900276 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6788bf7557-9kzzr_d09ec4e5-9afc-49c9-b2f0-92daacefb795/manager/0.log" Feb 18 12:10:12 crc kubenswrapper[4880]: I0218 12:10:12.473710 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-cbjbp_178d28de-f513-4c1f-bd35-886b84a1b892/control-plane-machine-set-operator/0.log" Feb 18 12:10:12 crc kubenswrapper[4880]: I0218 12:10:12.634661 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vdp59_7f8c1f5b-ef29-4f12-88af-7387efdd41e6/kube-rbac-proxy/0.log" Feb 18 12:10:12 crc kubenswrapper[4880]: I0218 12:10:12.676213 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vdp59_7f8c1f5b-ef29-4f12-88af-7387efdd41e6/machine-api-operator/0.log" Feb 18 12:10:23 crc kubenswrapper[4880]: I0218 12:10:23.275106 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:10:23 crc kubenswrapper[4880]: I0218 12:10:23.275725 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" 
podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:10:24 crc kubenswrapper[4880]: I0218 12:10:24.782437 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-p4bzm_9d5b3735-fda7-4348-bd36-ffde74bed4d5/cert-manager-controller/0.log" Feb 18 12:10:24 crc kubenswrapper[4880]: I0218 12:10:24.956388 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-5jz97_3412227f-d74c-4cdd-a225-e2eaa1d64ad0/cert-manager-webhook/0.log" Feb 18 12:10:24 crc kubenswrapper[4880]: I0218 12:10:24.989447 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-lxkwt_73158f7c-619d-4202-ae40-543a797efbf2/cert-manager-cainjector/0.log" Feb 18 12:10:36 crc kubenswrapper[4880]: I0218 12:10:36.092161 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-sth8f_6303451a-3d25-4508-9af5-a67136a4fe25/nmstate-console-plugin/0.log" Feb 18 12:10:36 crc kubenswrapper[4880]: I0218 12:10:36.268880 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jkq5d_55b70fd6-133c-41d1-b461-9507862b44fd/nmstate-handler/0.log" Feb 18 12:10:36 crc kubenswrapper[4880]: I0218 12:10:36.292039 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-mwnbs_abed6693-ac3d-4890-a45d-c9781587622e/kube-rbac-proxy/0.log" Feb 18 12:10:36 crc kubenswrapper[4880]: I0218 12:10:36.329418 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-mwnbs_abed6693-ac3d-4890-a45d-c9781587622e/nmstate-metrics/0.log" Feb 18 12:10:36 crc kubenswrapper[4880]: I0218 12:10:36.486143 4880 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-t6282_0dedf240-e215-4de3-8794-b1223146ba9e/nmstate-webhook/0.log" Feb 18 12:10:36 crc kubenswrapper[4880]: I0218 12:10:36.486294 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-cv2ht_233947b8-94c0-4814-b63d-cfbce95bc5e3/nmstate-operator/0.log" Feb 18 12:10:48 crc kubenswrapper[4880]: I0218 12:10:48.841501 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-csbj4_44a09229-68d6-48b2-99c3-cd3d3d9d2d9e/prometheus-operator/0.log" Feb 18 12:10:49 crc kubenswrapper[4880]: I0218 12:10:49.056308 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf_2924423e-1b79-4c15-b9b2-cb0d1619ad5c/prometheus-operator-admission-webhook/0.log" Feb 18 12:10:49 crc kubenswrapper[4880]: I0218 12:10:49.141015 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-59dc64659b-m55w8_043f42bf-3e01-42a2-87b4-999205377d66/prometheus-operator-admission-webhook/0.log" Feb 18 12:10:49 crc kubenswrapper[4880]: I0218 12:10:49.251278 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hqkfn_478db717-fcd1-4a34-a2ca-c98b0bda52f7/operator/0.log" Feb 18 12:10:49 crc kubenswrapper[4880]: I0218 12:10:49.357757 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-458b7_ba8852c5-a3ee-43ad-b559-14018ecccf33/perses-operator/0.log" Feb 18 12:10:53 crc kubenswrapper[4880]: I0218 12:10:53.273939 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 18 12:10:53 crc kubenswrapper[4880]: I0218 12:10:53.274301 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 12:11:01 crc kubenswrapper[4880]: I0218 12:11:01.882991 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-5h66c_6ea27bdd-ddfd-4c67-a386-cf768216df4a/kube-rbac-proxy/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.020865 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-5h66c_6ea27bdd-ddfd-4c67-a386-cf768216df4a/controller/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.117411 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-frr-files/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.304888 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-frr-files/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.312550 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-reloader/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.315778 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-reloader/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.315833 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-metrics/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 
12:11:02.490527 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-frr-files/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.512134 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-metrics/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.520975 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-reloader/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.570745 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-metrics/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.711015 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-frr-files/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.714074 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-metrics/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.732101 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/cp-reloader/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.755579 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/controller/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.883199 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/frr-metrics/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.957499 4880 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/kube-rbac-proxy-frr/0.log" Feb 18 12:11:02 crc kubenswrapper[4880]: I0218 12:11:02.980713 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/kube-rbac-proxy/0.log" Feb 18 12:11:03 crc kubenswrapper[4880]: I0218 12:11:03.036650 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/frr/0.log" Feb 18 12:11:03 crc kubenswrapper[4880]: I0218 12:11:03.060126 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fhz9j_254a0886-f36a-4066-a0e4-3b35f835d81c/reloader/0.log" Feb 18 12:11:03 crc kubenswrapper[4880]: I0218 12:11:03.210737 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-5fmhh_08be2c36-3a51-4cee-a9aa-f2140d689e95/frr-k8s-webhook-server/0.log" Feb 18 12:11:03 crc kubenswrapper[4880]: I0218 12:11:03.245862 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5d4f87c4b6-kq42s_e087df23-40b0-4ce0-bd5c-fc2431364d8d/manager/0.log" Feb 18 12:11:03 crc kubenswrapper[4880]: I0218 12:11:03.378871 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c6d8c54dd-zjnzl_fbb8b5bd-7246-423b-b444-cbb33f9bdad3/webhook-server/0.log" Feb 18 12:11:03 crc kubenswrapper[4880]: I0218 12:11:03.427487 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-n64cr_2c2514b7-fa9f-4be3-8eef-4558bb35093e/kube-rbac-proxy/0.log" Feb 18 12:11:03 crc kubenswrapper[4880]: I0218 12:11:03.667359 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-n64cr_2c2514b7-fa9f-4be3-8eef-4558bb35093e/speaker/0.log" Feb 18 12:11:15 crc kubenswrapper[4880]: I0218 12:11:15.261905 4880 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl_6937b296-4e20-44b0-ab02-f16a49ac827e/util/0.log" Feb 18 12:11:15 crc kubenswrapper[4880]: I0218 12:11:15.394265 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl_6937b296-4e20-44b0-ab02-f16a49ac827e/util/0.log" Feb 18 12:11:15 crc kubenswrapper[4880]: I0218 12:11:15.421634 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl_6937b296-4e20-44b0-ab02-f16a49ac827e/pull/0.log" Feb 18 12:11:15 crc kubenswrapper[4880]: I0218 12:11:15.468331 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl_6937b296-4e20-44b0-ab02-f16a49ac827e/pull/0.log" Feb 18 12:11:15 crc kubenswrapper[4880]: I0218 12:11:15.638201 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl_6937b296-4e20-44b0-ab02-f16a49ac827e/extract/0.log" Feb 18 12:11:15 crc kubenswrapper[4880]: I0218 12:11:15.647465 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl_6937b296-4e20-44b0-ab02-f16a49ac827e/pull/0.log" Feb 18 12:11:15 crc kubenswrapper[4880]: I0218 12:11:15.654426 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08qmhwl_6937b296-4e20-44b0-ab02-f16a49ac827e/util/0.log" Feb 18 12:11:15 crc kubenswrapper[4880]: I0218 12:11:15.822173 4880 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n_3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d/util/0.log" Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.007993 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n_3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d/util/0.log" Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.008339 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n_3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d/pull/0.log" Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.017752 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n_3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d/pull/0.log" Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.154829 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n_3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d/util/0.log" Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.184339 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n_3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d/extract/0.log" Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.198425 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2138jf6n_3ecb9c2c-8d6c-491e-baf5-d74ab8a2616d/pull/0.log" Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.334191 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hdgb8_679e3cd9-43cb-4010-93ce-d12efd5fc702/extract-utilities/0.log" Feb 18 12:11:16 crc 
kubenswrapper[4880]: I0218 12:11:16.482407 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hdgb8_679e3cd9-43cb-4010-93ce-d12efd5fc702/extract-utilities/0.log"
Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.507160 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hdgb8_679e3cd9-43cb-4010-93ce-d12efd5fc702/extract-content/0.log"
Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.524596 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hdgb8_679e3cd9-43cb-4010-93ce-d12efd5fc702/extract-content/0.log"
Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.719758 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hdgb8_679e3cd9-43cb-4010-93ce-d12efd5fc702/extract-utilities/0.log"
Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.732678 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hdgb8_679e3cd9-43cb-4010-93ce-d12efd5fc702/extract-content/0.log"
Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.919383 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92z9r_3e3fa583-5b51-4f41-9c12-71e0c018be1e/extract-utilities/0.log"
Feb 18 12:11:16 crc kubenswrapper[4880]: I0218 12:11:16.953483 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hdgb8_679e3cd9-43cb-4010-93ce-d12efd5fc702/registry-server/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.084973 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92z9r_3e3fa583-5b51-4f41-9c12-71e0c018be1e/extract-utilities/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.106451 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92z9r_3e3fa583-5b51-4f41-9c12-71e0c018be1e/extract-content/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.145932 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92z9r_3e3fa583-5b51-4f41-9c12-71e0c018be1e/extract-content/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.287652 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92z9r_3e3fa583-5b51-4f41-9c12-71e0c018be1e/extract-utilities/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.327286 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92z9r_3e3fa583-5b51-4f41-9c12-71e0c018be1e/extract-content/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.492751 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5_b9452f86-7c6a-45e7-bbe7-7980f815af42/util/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.535775 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92z9r_3e3fa583-5b51-4f41-9c12-71e0c018be1e/registry-server/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.704403 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5_b9452f86-7c6a-45e7-bbe7-7980f815af42/util/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.724405 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5_b9452f86-7c6a-45e7-bbe7-7980f815af42/pull/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.737491 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5_b9452f86-7c6a-45e7-bbe7-7980f815af42/pull/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.922715 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5_b9452f86-7c6a-45e7-bbe7-7980f815af42/extract/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.939655 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5_b9452f86-7c6a-45e7-bbe7-7980f815af42/util/0.log"
Feb 18 12:11:17 crc kubenswrapper[4880]: I0218 12:11:17.942470 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecah4zj5_b9452f86-7c6a-45e7-bbe7-7980f815af42/pull/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.083869 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-692kc_bc432e98-ce9f-455b-b3c7-b254dfb4a649/marketplace-operator/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.140724 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2szd_ba267750-47ef-4ac0-ad93-d82731ca5b8b/extract-utilities/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.257120 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2szd_ba267750-47ef-4ac0-ad93-d82731ca5b8b/extract-utilities/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.297421 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2szd_ba267750-47ef-4ac0-ad93-d82731ca5b8b/extract-content/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.313705 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2szd_ba267750-47ef-4ac0-ad93-d82731ca5b8b/extract-content/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.482040 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2szd_ba267750-47ef-4ac0-ad93-d82731ca5b8b/extract-utilities/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.516918 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2szd_ba267750-47ef-4ac0-ad93-d82731ca5b8b/extract-content/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.524813 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2szd_ba267750-47ef-4ac0-ad93-d82731ca5b8b/registry-server/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.651527 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gjvdn_a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64/extract-utilities/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.808929 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gjvdn_a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64/extract-utilities/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.822300 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gjvdn_a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64/extract-content/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.828629 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gjvdn_a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64/extract-content/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.974485 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gjvdn_a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64/extract-utilities/0.log"
Feb 18 12:11:18 crc kubenswrapper[4880]: I0218 12:11:18.978895 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gjvdn_a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64/extract-content/0.log"
Feb 18 12:11:19 crc kubenswrapper[4880]: I0218 12:11:19.229400 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gjvdn_a04aa5b2-4428-4ad6-bd3d-56ffee4d2c64/registry-server/0.log"
Feb 18 12:11:23 crc kubenswrapper[4880]: I0218 12:11:23.274309 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:11:23 crc kubenswrapper[4880]: I0218 12:11:23.274826 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:11:23 crc kubenswrapper[4880]: I0218 12:11:23.274869 4880 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp"
Feb 18 12:11:23 crc kubenswrapper[4880]: I0218 12:11:23.275408 4880 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db56eae41c466e7b8fe82658ddc5618a34717bb6d7aa0b464d124a307e8a7668"} pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 12:11:23 crc kubenswrapper[4880]: I0218 12:11:23.275461 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" containerID="cri-o://db56eae41c466e7b8fe82658ddc5618a34717bb6d7aa0b464d124a307e8a7668" gracePeriod=600
Feb 18 12:11:23 crc kubenswrapper[4880]: I0218 12:11:23.641079 4880 generic.go:334] "Generic (PLEG): container finished" podID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerID="db56eae41c466e7b8fe82658ddc5618a34717bb6d7aa0b464d124a307e8a7668" exitCode=0
Feb 18 12:11:23 crc kubenswrapper[4880]: I0218 12:11:23.641114 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerDied","Data":"db56eae41c466e7b8fe82658ddc5618a34717bb6d7aa0b464d124a307e8a7668"}
Feb 18 12:11:23 crc kubenswrapper[4880]: I0218 12:11:23.641412 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerStarted","Data":"3ea0126165533f6361923690ff3b11ae022b1f31de4bce7819f446749e2f2639"}
Feb 18 12:11:23 crc kubenswrapper[4880]: I0218 12:11:23.641444 4880 scope.go:117] "RemoveContainer" containerID="d675a88a38ecd91c14a002c1ada584ee97f355276bf8e2925348d67f9f1846e3"
Feb 18 12:11:30 crc kubenswrapper[4880]: I0218 12:11:30.192024 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-csbj4_44a09229-68d6-48b2-99c3-cd3d3d9d2d9e/prometheus-operator/0.log"
Feb 18 12:11:30 crc kubenswrapper[4880]: I0218 12:11:30.248043 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-59dc64659b-hgcsf_2924423e-1b79-4c15-b9b2-cb0d1619ad5c/prometheus-operator-admission-webhook/0.log"
Feb 18 12:11:30 crc kubenswrapper[4880]: I0218 12:11:30.302626 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-59dc64659b-m55w8_043f42bf-3e01-42a2-87b4-999205377d66/prometheus-operator-admission-webhook/0.log"
Feb 18 12:11:30 crc kubenswrapper[4880]: I0218 12:11:30.388581 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hqkfn_478db717-fcd1-4a34-a2ca-c98b0bda52f7/operator/0.log"
Feb 18 12:11:30 crc kubenswrapper[4880]: I0218 12:11:30.416552 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-458b7_ba8852c5-a3ee-43ad-b559-14018ecccf33/perses-operator/0.log"
Feb 18 12:12:28 crc kubenswrapper[4880]: I0218 12:12:28.259771 4880 generic.go:334] "Generic (PLEG): container finished" podID="a76872d3-673b-4433-bc92-39d7aa2af3f0" containerID="ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be" exitCode=0
Feb 18 12:12:28 crc kubenswrapper[4880]: I0218 12:12:28.259840 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mwfq/must-gather-92j92" event={"ID":"a76872d3-673b-4433-bc92-39d7aa2af3f0","Type":"ContainerDied","Data":"ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be"}
Feb 18 12:12:28 crc kubenswrapper[4880]: I0218 12:12:28.260809 4880 scope.go:117] "RemoveContainer" containerID="ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be"
Feb 18 12:12:29 crc kubenswrapper[4880]: I0218 12:12:29.061491 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mwfq_must-gather-92j92_a76872d3-673b-4433-bc92-39d7aa2af3f0/gather/0.log"
Feb 18 12:12:35 crc kubenswrapper[4880]: I0218 12:12:35.572033 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5mwfq/must-gather-92j92"]
Feb 18 12:12:35 crc kubenswrapper[4880]: I0218 12:12:35.573030 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5mwfq/must-gather-92j92" podUID="a76872d3-673b-4433-bc92-39d7aa2af3f0" containerName="copy" containerID="cri-o://acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c" gracePeriod=2
Feb 18 12:12:35 crc kubenswrapper[4880]: I0218 12:12:35.579477 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5mwfq/must-gather-92j92"]
Feb 18 12:12:35 crc kubenswrapper[4880]: I0218 12:12:35.943134 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mwfq_must-gather-92j92_a76872d3-673b-4433-bc92-39d7aa2af3f0/copy/0.log"
Feb 18 12:12:35 crc kubenswrapper[4880]: I0218 12:12:35.943984 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5mwfq/must-gather-92j92"
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.093755 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a76872d3-673b-4433-bc92-39d7aa2af3f0-must-gather-output\") pod \"a76872d3-673b-4433-bc92-39d7aa2af3f0\" (UID: \"a76872d3-673b-4433-bc92-39d7aa2af3f0\") "
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.106438 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjnbn\" (UniqueName: \"kubernetes.io/projected/a76872d3-673b-4433-bc92-39d7aa2af3f0-kube-api-access-fjnbn\") pod \"a76872d3-673b-4433-bc92-39d7aa2af3f0\" (UID: \"a76872d3-673b-4433-bc92-39d7aa2af3f0\") "
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.111979 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a76872d3-673b-4433-bc92-39d7aa2af3f0-kube-api-access-fjnbn" (OuterVolumeSpecName: "kube-api-access-fjnbn") pod "a76872d3-673b-4433-bc92-39d7aa2af3f0" (UID: "a76872d3-673b-4433-bc92-39d7aa2af3f0"). InnerVolumeSpecName "kube-api-access-fjnbn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.209194 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76872d3-673b-4433-bc92-39d7aa2af3f0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a76872d3-673b-4433-bc92-39d7aa2af3f0" (UID: "a76872d3-673b-4433-bc92-39d7aa2af3f0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.210358 4880 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a76872d3-673b-4433-bc92-39d7aa2af3f0-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.210379 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjnbn\" (UniqueName: \"kubernetes.io/projected/a76872d3-673b-4433-bc92-39d7aa2af3f0-kube-api-access-fjnbn\") on node \"crc\" DevicePath \"\""
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.314249 4880 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mwfq_must-gather-92j92_a76872d3-673b-4433-bc92-39d7aa2af3f0/copy/0.log"
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.314873 4880 generic.go:334] "Generic (PLEG): container finished" podID="a76872d3-673b-4433-bc92-39d7aa2af3f0" containerID="acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c" exitCode=143
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.314916 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5mwfq/must-gather-92j92"
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.315025 4880 scope.go:117] "RemoveContainer" containerID="acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c"
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.338010 4880 scope.go:117] "RemoveContainer" containerID="ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be"
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.409254 4880 scope.go:117] "RemoveContainer" containerID="acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c"
Feb 18 12:12:36 crc kubenswrapper[4880]: E0218 12:12:36.410860 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c\": container with ID starting with acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c not found: ID does not exist" containerID="acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c"
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.410926 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c"} err="failed to get container status \"acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c\": rpc error: code = NotFound desc = could not find container \"acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c\": container with ID starting with acae148701553913caeb438ebcdc622da8bd5efe84895f6c4c4d1510809a8d3c not found: ID does not exist"
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.410959 4880 scope.go:117] "RemoveContainer" containerID="ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be"
Feb 18 12:12:36 crc kubenswrapper[4880]: E0218 12:12:36.411415 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be\": container with ID starting with ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be not found: ID does not exist" containerID="ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be"
Feb 18 12:12:36 crc kubenswrapper[4880]: I0218 12:12:36.411456 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be"} err="failed to get container status \"ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be\": rpc error: code = NotFound desc = could not find container \"ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be\": container with ID starting with ebf74bc3739978d4266bdf1ab831204184511d5c793bff45195f3d0212a9a9be not found: ID does not exist"
Feb 18 12:12:37 crc kubenswrapper[4880]: I0218 12:12:37.189174 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a76872d3-673b-4433-bc92-39d7aa2af3f0" path="/var/lib/kubelet/pods/a76872d3-673b-4433-bc92-39d7aa2af3f0/volumes"
Feb 18 12:13:23 crc kubenswrapper[4880]: I0218 12:13:23.274357 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:13:23 crc kubenswrapper[4880]: I0218 12:13:23.274934 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:13:53 crc kubenswrapper[4880]: I0218 12:13:53.274286 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:13:53 crc kubenswrapper[4880]: I0218 12:13:53.274848 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:14:23 crc kubenswrapper[4880]: I0218 12:14:23.273757 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 12:14:23 crc kubenswrapper[4880]: I0218 12:14:23.274321 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 12:14:23 crc kubenswrapper[4880]: I0218 12:14:23.274369 4880 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp"
Feb 18 12:14:23 crc kubenswrapper[4880]: I0218 12:14:23.274910 4880 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ea0126165533f6361923690ff3b11ae022b1f31de4bce7819f446749e2f2639"} pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 12:14:23 crc kubenswrapper[4880]: I0218 12:14:23.274991 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" containerID="cri-o://3ea0126165533f6361923690ff3b11ae022b1f31de4bce7819f446749e2f2639" gracePeriod=600
Feb 18 12:14:24 crc kubenswrapper[4880]: I0218 12:14:24.062644 4880 generic.go:334] "Generic (PLEG): container finished" podID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerID="3ea0126165533f6361923690ff3b11ae022b1f31de4bce7819f446749e2f2639" exitCode=0
Feb 18 12:14:24 crc kubenswrapper[4880]: I0218 12:14:24.062636 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerDied","Data":"3ea0126165533f6361923690ff3b11ae022b1f31de4bce7819f446749e2f2639"}
Feb 18 12:14:24 crc kubenswrapper[4880]: I0218 12:14:24.063100 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" event={"ID":"bf5fee6a-c0f1-43c5-8991-cc078ccb904d","Type":"ContainerStarted","Data":"a44ee39c1aeedc5779e44c5b323948d6f3f19aa5ecc57e41debf456f787e41e2"}
Feb 18 12:14:24 crc kubenswrapper[4880]: I0218 12:14:24.063133 4880 scope.go:117] "RemoveContainer" containerID="db56eae41c466e7b8fe82658ddc5618a34717bb6d7aa0b464d124a307e8a7668"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.147829 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"]
Feb 18 12:15:00 crc kubenswrapper[4880]: E0218 12:15:00.148600 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76872d3-673b-4433-bc92-39d7aa2af3f0" containerName="gather"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.148632 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76872d3-673b-4433-bc92-39d7aa2af3f0" containerName="gather"
Feb 18 12:15:00 crc kubenswrapper[4880]: E0218 12:15:00.148645 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76872d3-673b-4433-bc92-39d7aa2af3f0" containerName="copy"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.148650 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76872d3-673b-4433-bc92-39d7aa2af3f0" containerName="copy"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.148849 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="a76872d3-673b-4433-bc92-39d7aa2af3f0" containerName="copy"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.148881 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="a76872d3-673b-4433-bc92-39d7aa2af3f0" containerName="gather"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.149426 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.151922 4880 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.152498 4880 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.182566 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b73e8bb2-e408-4ff0-8351-b732b1f20693-config-volume\") pod \"collect-profiles-29523615-7bzpd\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.182717 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47q6z\" (UniqueName: \"kubernetes.io/projected/b73e8bb2-e408-4ff0-8351-b732b1f20693-kube-api-access-47q6z\") pod \"collect-profiles-29523615-7bzpd\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.182751 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b73e8bb2-e408-4ff0-8351-b732b1f20693-secret-volume\") pod \"collect-profiles-29523615-7bzpd\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.186899 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"]
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.283977 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47q6z\" (UniqueName: \"kubernetes.io/projected/b73e8bb2-e408-4ff0-8351-b732b1f20693-kube-api-access-47q6z\") pod \"collect-profiles-29523615-7bzpd\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.284023 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b73e8bb2-e408-4ff0-8351-b732b1f20693-secret-volume\") pod \"collect-profiles-29523615-7bzpd\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.284056 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b73e8bb2-e408-4ff0-8351-b732b1f20693-config-volume\") pod \"collect-profiles-29523615-7bzpd\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.285009 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b73e8bb2-e408-4ff0-8351-b732b1f20693-config-volume\") pod \"collect-profiles-29523615-7bzpd\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.292905 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b73e8bb2-e408-4ff0-8351-b732b1f20693-secret-volume\") pod \"collect-profiles-29523615-7bzpd\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.309338 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47q6z\" (UniqueName: \"kubernetes.io/projected/b73e8bb2-e408-4ff0-8351-b732b1f20693-kube-api-access-47q6z\") pod \"collect-profiles-29523615-7bzpd\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.470647 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:00 crc kubenswrapper[4880]: I0218 12:15:00.868752 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"]
Feb 18 12:15:01 crc kubenswrapper[4880]: I0218 12:15:01.304527 4880 generic.go:334] "Generic (PLEG): container finished" podID="b73e8bb2-e408-4ff0-8351-b732b1f20693" containerID="b7475e4547ecf46c422d0c80972b41ca33d47c2c6ce7aad3f9e9292246c66c5f" exitCode=0
Feb 18 12:15:01 crc kubenswrapper[4880]: I0218 12:15:01.304573 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd" event={"ID":"b73e8bb2-e408-4ff0-8351-b732b1f20693","Type":"ContainerDied","Data":"b7475e4547ecf46c422d0c80972b41ca33d47c2c6ce7aad3f9e9292246c66c5f"}
Feb 18 12:15:01 crc kubenswrapper[4880]: I0218 12:15:01.304602 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd" event={"ID":"b73e8bb2-e408-4ff0-8351-b732b1f20693","Type":"ContainerStarted","Data":"257a4d4c459ebd153a0fa9f40718dec182937d23cbbc8be0634caf70842c4c11"}
Feb 18 12:15:02 crc kubenswrapper[4880]: I0218 12:15:02.569784 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:02 crc kubenswrapper[4880]: I0218 12:15:02.714144 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47q6z\" (UniqueName: \"kubernetes.io/projected/b73e8bb2-e408-4ff0-8351-b732b1f20693-kube-api-access-47q6z\") pod \"b73e8bb2-e408-4ff0-8351-b732b1f20693\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") "
Feb 18 12:15:02 crc kubenswrapper[4880]: I0218 12:15:02.714232 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b73e8bb2-e408-4ff0-8351-b732b1f20693-config-volume\") pod \"b73e8bb2-e408-4ff0-8351-b732b1f20693\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") "
Feb 18 12:15:02 crc kubenswrapper[4880]: I0218 12:15:02.714265 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b73e8bb2-e408-4ff0-8351-b732b1f20693-secret-volume\") pod \"b73e8bb2-e408-4ff0-8351-b732b1f20693\" (UID: \"b73e8bb2-e408-4ff0-8351-b732b1f20693\") "
Feb 18 12:15:02 crc kubenswrapper[4880]: I0218 12:15:02.715535 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b73e8bb2-e408-4ff0-8351-b732b1f20693-config-volume" (OuterVolumeSpecName: "config-volume") pod "b73e8bb2-e408-4ff0-8351-b732b1f20693" (UID: "b73e8bb2-e408-4ff0-8351-b732b1f20693"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 12:15:02 crc kubenswrapper[4880]: I0218 12:15:02.720327 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73e8bb2-e408-4ff0-8351-b732b1f20693-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b73e8bb2-e408-4ff0-8351-b732b1f20693" (UID: "b73e8bb2-e408-4ff0-8351-b732b1f20693"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 12:15:02 crc kubenswrapper[4880]: I0218 12:15:02.720660 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73e8bb2-e408-4ff0-8351-b732b1f20693-kube-api-access-47q6z" (OuterVolumeSpecName: "kube-api-access-47q6z") pod "b73e8bb2-e408-4ff0-8351-b732b1f20693" (UID: "b73e8bb2-e408-4ff0-8351-b732b1f20693"). InnerVolumeSpecName "kube-api-access-47q6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 12:15:02 crc kubenswrapper[4880]: I0218 12:15:02.815493 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47q6z\" (UniqueName: \"kubernetes.io/projected/b73e8bb2-e408-4ff0-8351-b732b1f20693-kube-api-access-47q6z\") on node \"crc\" DevicePath \"\""
Feb 18 12:15:02 crc kubenswrapper[4880]: I0218 12:15:02.815520 4880 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b73e8bb2-e408-4ff0-8351-b732b1f20693-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 12:15:02 crc kubenswrapper[4880]: I0218 12:15:02.815530 4880 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b73e8bb2-e408-4ff0-8351-b732b1f20693-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 18 12:15:03 crc kubenswrapper[4880]: I0218 12:15:03.319069 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd" event={"ID":"b73e8bb2-e408-4ff0-8351-b732b1f20693","Type":"ContainerDied","Data":"257a4d4c459ebd153a0fa9f40718dec182937d23cbbc8be0634caf70842c4c11"}
Feb 18 12:15:03 crc kubenswrapper[4880]: I0218 12:15:03.319114 4880 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="257a4d4c459ebd153a0fa9f40718dec182937d23cbbc8be0634caf70842c4c11"
Feb 18 12:15:03 crc kubenswrapper[4880]: I0218 12:15:03.319144 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523615-7bzpd"
Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.021543 4880 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s5nx8"]
Feb 18 12:15:07 crc kubenswrapper[4880]: E0218 12:15:07.022305 4880 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73e8bb2-e408-4ff0-8351-b732b1f20693" containerName="collect-profiles"
Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.022321 4880 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73e8bb2-e408-4ff0-8351-b732b1f20693" containerName="collect-profiles"
Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.022487 4880 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73e8bb2-e408-4ff0-8351-b732b1f20693" containerName="collect-profiles"
Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.028228 4880 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s5nx8"
Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.036644 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s5nx8"]
Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.173867 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-utilities\") pod \"redhat-operators-s5nx8\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " pod="openshift-marketplace/redhat-operators-s5nx8"
Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.174210 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-catalog-content\") pod \"redhat-operators-s5nx8\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " pod="openshift-marketplace/redhat-operators-s5nx8"
Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.174491 4880 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t76c6\" (UniqueName: \"kubernetes.io/projected/905d3da6-d400-4e27-ba81-6bdc18aec5c9-kube-api-access-t76c6\") pod \"redhat-operators-s5nx8\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " pod="openshift-marketplace/redhat-operators-s5nx8"
Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.276467 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-utilities\") pod \"redhat-operators-s5nx8\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " pod="openshift-marketplace/redhat-operators-s5nx8"
Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.276589 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-catalog-content\") pod \"redhat-operators-s5nx8\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.276724 4880 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t76c6\" (UniqueName: \"kubernetes.io/projected/905d3da6-d400-4e27-ba81-6bdc18aec5c9-kube-api-access-t76c6\") pod \"redhat-operators-s5nx8\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.277879 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-catalog-content\") pod \"redhat-operators-s5nx8\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.277963 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-utilities\") pod \"redhat-operators-s5nx8\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.296954 4880 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t76c6\" (UniqueName: \"kubernetes.io/projected/905d3da6-d400-4e27-ba81-6bdc18aec5c9-kube-api-access-t76c6\") pod \"redhat-operators-s5nx8\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.345775 4880 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:07 crc kubenswrapper[4880]: I0218 12:15:07.785655 4880 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s5nx8"] Feb 18 12:15:08 crc kubenswrapper[4880]: I0218 12:15:08.349441 4880 generic.go:334] "Generic (PLEG): container finished" podID="905d3da6-d400-4e27-ba81-6bdc18aec5c9" containerID="bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193" exitCode=0 Feb 18 12:15:08 crc kubenswrapper[4880]: I0218 12:15:08.349698 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nx8" event={"ID":"905d3da6-d400-4e27-ba81-6bdc18aec5c9","Type":"ContainerDied","Data":"bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193"} Feb 18 12:15:08 crc kubenswrapper[4880]: I0218 12:15:08.349723 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nx8" event={"ID":"905d3da6-d400-4e27-ba81-6bdc18aec5c9","Type":"ContainerStarted","Data":"4d5c59c683e9f9d69fc00bdb98b40ec00f7eefc0b65af4322cba56398caf1b6f"} Feb 18 12:15:08 crc kubenswrapper[4880]: I0218 12:15:08.351581 4880 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 12:15:11 crc kubenswrapper[4880]: I0218 12:15:11.371319 4880 generic.go:334] "Generic (PLEG): container finished" podID="905d3da6-d400-4e27-ba81-6bdc18aec5c9" containerID="4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859" exitCode=0 Feb 18 12:15:11 crc kubenswrapper[4880]: I0218 12:15:11.371444 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nx8" event={"ID":"905d3da6-d400-4e27-ba81-6bdc18aec5c9","Type":"ContainerDied","Data":"4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859"} Feb 18 12:15:12 crc kubenswrapper[4880]: I0218 12:15:12.385503 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-s5nx8" event={"ID":"905d3da6-d400-4e27-ba81-6bdc18aec5c9","Type":"ContainerStarted","Data":"a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0"} Feb 18 12:15:12 crc kubenswrapper[4880]: I0218 12:15:12.411451 4880 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s5nx8" podStartSLOduration=1.85379139 podStartE2EDuration="5.411412366s" podCreationTimestamp="2026-02-18 12:15:07 +0000 UTC" firstStartedPulling="2026-02-18 12:15:08.35136736 +0000 UTC m=+1415.780268221" lastFinishedPulling="2026-02-18 12:15:11.908988336 +0000 UTC m=+1419.337889197" observedRunningTime="2026-02-18 12:15:12.405525744 +0000 UTC m=+1419.834426605" watchObservedRunningTime="2026-02-18 12:15:12.411412366 +0000 UTC m=+1419.840313237" Feb 18 12:15:17 crc kubenswrapper[4880]: I0218 12:15:17.346109 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:17 crc kubenswrapper[4880]: I0218 12:15:17.346496 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:18 crc kubenswrapper[4880]: I0218 12:15:18.386540 4880 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s5nx8" podUID="905d3da6-d400-4e27-ba81-6bdc18aec5c9" containerName="registry-server" probeResult="failure" output=< Feb 18 12:15:18 crc kubenswrapper[4880]: timeout: failed to connect service ":50051" within 1s Feb 18 12:15:18 crc kubenswrapper[4880]: > Feb 18 12:15:27 crc kubenswrapper[4880]: I0218 12:15:27.391932 4880 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:27 crc kubenswrapper[4880]: I0218 12:15:27.438024 4880 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s5nx8" 
Feb 18 12:15:27 crc kubenswrapper[4880]: I0218 12:15:27.629058 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s5nx8"] Feb 18 12:15:28 crc kubenswrapper[4880]: I0218 12:15:28.500222 4880 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s5nx8" podUID="905d3da6-d400-4e27-ba81-6bdc18aec5c9" containerName="registry-server" containerID="cri-o://a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0" gracePeriod=2 Feb 18 12:15:28 crc kubenswrapper[4880]: I0218 12:15:28.871089 4880 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:28 crc kubenswrapper[4880]: I0218 12:15:28.994512 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t76c6\" (UniqueName: \"kubernetes.io/projected/905d3da6-d400-4e27-ba81-6bdc18aec5c9-kube-api-access-t76c6\") pod \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " Feb 18 12:15:28 crc kubenswrapper[4880]: I0218 12:15:28.994625 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-utilities\") pod \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " Feb 18 12:15:28 crc kubenswrapper[4880]: I0218 12:15:28.994652 4880 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-catalog-content\") pod \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\" (UID: \"905d3da6-d400-4e27-ba81-6bdc18aec5c9\") " Feb 18 12:15:28 crc kubenswrapper[4880]: I0218 12:15:28.995877 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-utilities" (OuterVolumeSpecName: "utilities") pod "905d3da6-d400-4e27-ba81-6bdc18aec5c9" (UID: "905d3da6-d400-4e27-ba81-6bdc18aec5c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.000385 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/905d3da6-d400-4e27-ba81-6bdc18aec5c9-kube-api-access-t76c6" (OuterVolumeSpecName: "kube-api-access-t76c6") pod "905d3da6-d400-4e27-ba81-6bdc18aec5c9" (UID: "905d3da6-d400-4e27-ba81-6bdc18aec5c9"). InnerVolumeSpecName "kube-api-access-t76c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.096123 4880 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t76c6\" (UniqueName: \"kubernetes.io/projected/905d3da6-d400-4e27-ba81-6bdc18aec5c9-kube-api-access-t76c6\") on node \"crc\" DevicePath \"\"" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.096543 4880 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.118676 4880 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "905d3da6-d400-4e27-ba81-6bdc18aec5c9" (UID: "905d3da6-d400-4e27-ba81-6bdc18aec5c9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.198015 4880 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905d3da6-d400-4e27-ba81-6bdc18aec5c9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.508248 4880 generic.go:334] "Generic (PLEG): container finished" podID="905d3da6-d400-4e27-ba81-6bdc18aec5c9" containerID="a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0" exitCode=0 Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.508285 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nx8" event={"ID":"905d3da6-d400-4e27-ba81-6bdc18aec5c9","Type":"ContainerDied","Data":"a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0"} Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.508310 4880 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nx8" event={"ID":"905d3da6-d400-4e27-ba81-6bdc18aec5c9","Type":"ContainerDied","Data":"4d5c59c683e9f9d69fc00bdb98b40ec00f7eefc0b65af4322cba56398caf1b6f"} Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.508307 4880 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s5nx8" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.508338 4880 scope.go:117] "RemoveContainer" containerID="a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.531570 4880 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s5nx8"] Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.538188 4880 scope.go:117] "RemoveContainer" containerID="4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.539829 4880 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s5nx8"] Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.563795 4880 scope.go:117] "RemoveContainer" containerID="bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.585633 4880 scope.go:117] "RemoveContainer" containerID="a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0" Feb 18 12:15:29 crc kubenswrapper[4880]: E0218 12:15:29.586066 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0\": container with ID starting with a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0 not found: ID does not exist" containerID="a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.586100 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0"} err="failed to get container status \"a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0\": rpc error: code = NotFound desc = could not find container 
\"a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0\": container with ID starting with a5012e5562c49b251f27ce67238cdf8179e79b6441e397f330de41011e71a4c0 not found: ID does not exist" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.586126 4880 scope.go:117] "RemoveContainer" containerID="4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859" Feb 18 12:15:29 crc kubenswrapper[4880]: E0218 12:15:29.586893 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859\": container with ID starting with 4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859 not found: ID does not exist" containerID="4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.586941 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859"} err="failed to get container status \"4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859\": rpc error: code = NotFound desc = could not find container \"4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859\": container with ID starting with 4503221d0aee7ec7c73f1527bd918af645c5a40ad5ffe012b5fa891dd4330859 not found: ID does not exist" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.586975 4880 scope.go:117] "RemoveContainer" containerID="bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193" Feb 18 12:15:29 crc kubenswrapper[4880]: E0218 12:15:29.587319 4880 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193\": container with ID starting with bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193 not found: ID does not exist" 
containerID="bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193" Feb 18 12:15:29 crc kubenswrapper[4880]: I0218 12:15:29.587348 4880 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193"} err="failed to get container status \"bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193\": rpc error: code = NotFound desc = could not find container \"bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193\": container with ID starting with bf7297ce9d3d35be24596e3dea639ea66e5579edd5081b3ee1f5f9db564c9193 not found: ID does not exist" Feb 18 12:15:31 crc kubenswrapper[4880]: I0218 12:15:31.193184 4880 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="905d3da6-d400-4e27-ba81-6bdc18aec5c9" path="/var/lib/kubelet/pods/905d3da6-d400-4e27-ba81-6bdc18aec5c9/volumes" Feb 18 12:16:23 crc kubenswrapper[4880]: I0218 12:16:23.274502 4880 patch_prober.go:28] interesting pod/machine-config-daemon-c8jsp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 12:16:23 crc kubenswrapper[4880]: I0218 12:16:23.275109 4880 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c8jsp" podUID="bf5fee6a-c0f1-43c5-8991-cc078ccb904d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"